Dec 11 13:04:01 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 11 13:04:01 crc restorecon[4681]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 11 13:04:01 crc restorecon[4681]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc 
restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc 
restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 
13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc 
restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 13:04:01 crc 
restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 
crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 13:04:01 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 11 13:04:01 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 
crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc 
restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 11 13:04:02 crc restorecon[4681]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc 
restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 13:04:02 crc restorecon[4681]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 13:04:02 crc restorecon[4681]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 11 13:04:02 crc kubenswrapper[4898]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 11 13:04:02 crc kubenswrapper[4898]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 11 13:04:02 crc kubenswrapper[4898]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 11 13:04:02 crc kubenswrapper[4898]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 11 13:04:02 crc kubenswrapper[4898]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 11 13:04:02 crc kubenswrapper[4898]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.604521 4898 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.609940 4898 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.609960 4898 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.609967 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.609973 4898 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.609978 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.609984 4898 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.609989 4898 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.609998 4898 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610006 4898 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610013 4898 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610020 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610025 4898 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610030 4898 feature_gate.go:330] unrecognized feature gate: Example
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610036 4898 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610042 4898 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610047 4898 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610053 4898 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610058 4898 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610065 4898 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610070 4898 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610076 4898 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610081 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610087 4898 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610103 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610109 4898 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610115 4898 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610120 4898 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610126 4898 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610134 4898 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610140 4898 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610145 4898 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610151 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610156 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610161 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610166 4898 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610171 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610176 4898 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610181 4898 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610186 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610192 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610198 4898 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610203 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610208 4898 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610214 4898 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610220 4898 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610225 4898 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610230 4898 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610235 4898 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610240 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610245 4898 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610251 4898 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610256 4898 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610263 4898 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610270 4898 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610275 4898 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610281 4898 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610286 4898 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610292 4898 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610300 4898 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610306 4898 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610312 4898 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610317 4898 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610323 4898 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610328 4898 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610334 4898 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610340 4898 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610346 4898 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610352 4898 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610358 4898 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610363 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.610368 4898 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610484 4898 flags.go:64] FLAG: --address="0.0.0.0"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610497 4898 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610507 4898 flags.go:64] FLAG: --anonymous-auth="true"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610515 4898 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610522 4898 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610528 4898 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610536 4898 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610544 4898 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610551 4898 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610558 4898 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610564 4898 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610570 4898 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610576 4898 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610582 4898 flags.go:64] FLAG: --cgroup-root=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610588 4898 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610594 4898 flags.go:64] FLAG: --client-ca-file=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610600 4898 flags.go:64] FLAG: --cloud-config=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610607 4898 flags.go:64] FLAG: --cloud-provider=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610612 4898 flags.go:64] FLAG: --cluster-dns="[]"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610619 4898 flags.go:64] FLAG: --cluster-domain=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610625 4898 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610631 4898 flags.go:64] FLAG: --config-dir=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610640 4898 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610647 4898 flags.go:64] FLAG: --container-log-max-files="5"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610654 4898 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610660 4898 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610666 4898 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610673 4898 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610678 4898 flags.go:64] FLAG: --contention-profiling="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610684 4898 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610690 4898 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610697 4898 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610703 4898 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610710 4898 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610716 4898 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610722 4898 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610728 4898 flags.go:64] FLAG: --enable-load-reader="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610734 4898 flags.go:64] FLAG: --enable-server="true"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610740 4898 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610750 4898 flags.go:64] FLAG: --event-burst="100"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610756 4898 flags.go:64] FLAG: --event-qps="50"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610762 4898 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610768 4898 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610775 4898 flags.go:64] FLAG: --eviction-hard=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610782 4898 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610788 4898 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610793 4898 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610800 4898 flags.go:64] FLAG: --eviction-soft=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610806 4898 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610812 4898 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610817 4898 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610823 4898 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610829 4898 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610835 4898 flags.go:64] FLAG: --fail-swap-on="true"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610843 4898 flags.go:64] FLAG: --feature-gates=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610850 4898 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610856 4898 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610862 4898 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610868 4898 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610875 4898 flags.go:64] FLAG: --healthz-port="10248"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610881 4898 flags.go:64] FLAG: --help="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610887 4898 flags.go:64] FLAG: --hostname-override=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610893 4898 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610900 4898 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610908 4898 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610914 4898 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610919 4898 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610925 4898 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610932 4898 flags.go:64] FLAG: --image-service-endpoint=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610938 4898 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610944 4898 flags.go:64] FLAG: --kube-api-burst="100"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610950 4898 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610956 4898 flags.go:64] FLAG: --kube-api-qps="50"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610962 4898 flags.go:64] FLAG: --kube-reserved=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610969 4898 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610974 4898 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610980 4898 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610986 4898 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610992 4898 flags.go:64] FLAG: --lock-file=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.610999 4898 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611005 4898 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611012 4898 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611028 4898 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611035 4898 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611042 4898 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611048 4898 flags.go:64] FLAG: --logging-format="text"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611055 4898 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611062 4898 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611068 4898 flags.go:64] FLAG: --manifest-url=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611073 4898 flags.go:64] FLAG: --manifest-url-header=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611081 4898 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611087 4898 flags.go:64] FLAG: --max-open-files="1000000"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611094 4898 flags.go:64] FLAG: --max-pods="110"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611100 4898 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611106 4898 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611112 4898 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611119 4898 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611125 4898 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611131 4898 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611137 4898 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611150 4898 flags.go:64] FLAG: --node-status-max-images="50"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611157 4898 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611163 4898 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611169 4898 flags.go:64] FLAG: --pod-cidr=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611175 4898 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611184 4898 flags.go:64] FLAG: --pod-manifest-path=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611190 4898 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611197 4898 flags.go:64] FLAG: --pods-per-core="0"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611203 4898 flags.go:64] FLAG: --port="10250"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611209 4898 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611215 4898 flags.go:64] FLAG: --provider-id=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611221 4898 flags.go:64] FLAG: --qos-reserved=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611227 4898 flags.go:64] FLAG: --read-only-port="10255"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611233 4898 flags.go:64] FLAG: --register-node="true"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611239 4898 flags.go:64] FLAG: --register-schedulable="true"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611246 4898 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611256 4898 flags.go:64] FLAG: --registry-burst="10"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611262 4898 flags.go:64] FLAG: --registry-qps="5"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611270 4898 flags.go:64] FLAG: --reserved-cpus=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611276 4898 flags.go:64] FLAG: --reserved-memory=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611283 4898 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611290 4898 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611296 4898 flags.go:64] FLAG: --rotate-certificates="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611302 4898 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611308 4898 flags.go:64] FLAG: --runonce="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611314 4898 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611321 4898 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611327 4898 flags.go:64] FLAG: --seccomp-default="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611333 4898 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611339 4898 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611345 4898 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611351 4898 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611358 4898 flags.go:64] FLAG: --storage-driver-password="root"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611364 4898 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611370 4898 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611376 4898 flags.go:64] FLAG: --storage-driver-user="root"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611382 4898 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611388 4898 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611394 4898 flags.go:64] FLAG: --system-cgroups=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611400 4898 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611410 4898 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611416 4898 flags.go:64] FLAG: --tls-cert-file=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611422 4898 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611429 4898 flags.go:64] FLAG: --tls-min-version=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611435 4898 flags.go:64] FLAG: --tls-private-key-file=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611441 4898 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611447 4898 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611471 4898 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611477 4898 flags.go:64] FLAG: --v="2"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611485 4898 flags.go:64] FLAG: --version="false"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611496 4898 flags.go:64] FLAG: --vmodule=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611504 4898 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.611511 4898 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611657 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611664 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611671 4898 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611677 4898 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611684 4898 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611689 4898 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611695 4898 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611701 4898 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611707 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611714 4898 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611720 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611726 4898 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611731 4898 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611737 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611742 4898 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611750 4898 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611757 4898 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611764 4898 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611771 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611777 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611783 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611789 4898 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611794 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611799 4898 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611804 4898 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611810 4898 feature_gate.go:330] unrecognized feature gate: Example
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611817 4898 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611824 4898 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611830 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611838 4898 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611845 4898 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611851 4898 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611856 4898 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611861 4898 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611868 4898 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611873 4898 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611879 4898 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611884 4898 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611889 4898 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611894 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611900 4898 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611906 4898 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611911 4898 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611916 4898 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611923 4898 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611929 4898 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611935 4898 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611940 4898 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611946 4898 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611951 4898 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611956 4898 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611962 4898 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611968 4898 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611973 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611979 4898 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611984 4898 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611990 4898 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.611996 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.612002 4898 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.612008 4898 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.612014 4898 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.612022 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.612028 4898 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.612034 4898 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.612039 4898 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.612045 4898 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.612050 4898 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.612056 4898 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.612062 4898 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.612067 4898 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.612074 4898 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.612091 4898 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.624919 4898 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.625004 4898 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625122 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625137 4898 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625147 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625154 4898 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625161 4898 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625168 4898 feature_gate.go:330] unrecognized feature gate: Example
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625175 4898 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625182 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625189 4898 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625195 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625203 4898 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625210 4898 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625217 4898 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625226 4898 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625237 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625245 4898 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625252 4898 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625260 4898 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625269 4898 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625277 4898 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625284 4898 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625291 4898 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625298 4898 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625305 4898 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625311 4898 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625319 4898 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625327 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625334 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625342 4898 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625350 4898 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625357 4898 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625364 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625370 4898 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625377 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625388 4898 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625395 4898 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625401 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625409 4898 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625415 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625422 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 
13:04:02.625430 4898 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625436 4898 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625443 4898 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625449 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625482 4898 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625488 4898 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625495 4898 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625500 4898 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625506 4898 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625513 4898 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625520 4898 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625552 4898 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625559 4898 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625565 4898 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625571 4898 feature_gate.go:330] unrecognized feature gate: 
VSphereMultiVCenters Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625578 4898 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625584 4898 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625590 4898 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625596 4898 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625603 4898 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625609 4898 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625642 4898 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625649 4898 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625655 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625664 4898 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625672 4898 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625681 4898 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625688 4898 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625694 4898 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625700 4898 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625708 4898 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.625720 4898 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625930 4898 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625943 4898 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625950 4898 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625957 4898 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625964 4898 feature_gate.go:330] 
unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625971 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625978 4898 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625985 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625992 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.625999 4898 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626006 4898 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626012 4898 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626018 4898 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626024 4898 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626031 4898 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626037 4898 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626043 4898 feature_gate.go:330] unrecognized feature gate: Example Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626050 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626057 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 11 13:04:02 crc 
kubenswrapper[4898]: W1211 13:04:02.626063 4898 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626070 4898 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626076 4898 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626083 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626089 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626096 4898 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626102 4898 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626109 4898 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626116 4898 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626122 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626129 4898 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626136 4898 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626142 4898 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626149 4898 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626156 
4898 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626164 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626170 4898 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626179 4898 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626189 4898 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626196 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626203 4898 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626210 4898 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626216 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626222 4898 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626229 4898 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626235 4898 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626242 4898 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626248 4898 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626254 4898 feature_gate.go:330] unrecognized feature gate: 
PersistentIPsForVirtualization Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626261 4898 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626268 4898 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626275 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626281 4898 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626288 4898 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626294 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626301 4898 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626308 4898 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626314 4898 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626321 4898 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626330 4898 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626339 4898 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626348 4898 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626356 4898 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626364 4898 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626373 4898 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626380 4898 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626387 4898 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626393 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626400 4898 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626407 4898 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626414 4898 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.626421 4898 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.626434 4898 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 
13:04:02.626740 4898 server.go:940] "Client rotation is on, will bootstrap in background" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.630863 4898 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.631271 4898 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.632075 4898 server.go:997] "Starting client certificate rotation" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.632132 4898 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.632343 4898 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-27 18:41:22.202515129 +0000 UTC Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.632422 4898 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 389h37m19.570096373s for next certificate rotation Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.639241 4898 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.641508 4898 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.652068 4898 log.go:25] "Validated CRI v1 runtime API" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.678028 4898 log.go:25] "Validated CRI v1 image API" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.679542 4898 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.683622 4898 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-11-12-59-52-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.683656 4898 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.704503 4898 manager.go:217] Machine: {Timestamp:2025-12-11 13:04:02.701549787 +0000 UTC m=+0.273876264 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:3ede5a2c-67f9-4bff-827e-03a23908e5c0 BootID:160776e4-8c30-4eed-9dbf-aa0b51733cfb Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 
Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:c8:6c:5e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:c8:6c:5e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:0f:20:39 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d4:69:b6 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:c6:72:cf Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:b7:12:28 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:5a:7e:c1:38:d1:83 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:c6:96:8a:33:51:34 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 
Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.704785 4898 manager_no_libpfm.go:29] cAdvisor is build without cgo 
and/or libpfm support. Perf event counters are not available. Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.704981 4898 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.706177 4898 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.706370 4898 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.706411 4898 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":nul
l,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.706769 4898 topology_manager.go:138] "Creating topology manager with none policy" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.706783 4898 container_manager_linux.go:303] "Creating device plugin manager" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.707073 4898 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.707122 4898 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.707299 4898 state_mem.go:36] "Initialized new in-memory state store" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.707699 4898 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.709346 4898 kubelet.go:418] "Attempting to sync node with API server" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.709370 4898 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.709387 4898 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.709402 4898 kubelet.go:324] "Adding apiserver pod source" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.709415 4898 apiserver.go:42] "Waiting for node sync before watching apiserver pods" 
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.711491 4898 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.711823 4898 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.712670 4898 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.713207 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.713236 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.713244 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.713253 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.713267 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.713275 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.713305 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.713317 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.713327 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.713335 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 11 13:04:02 crc 
kubenswrapper[4898]: I1211 13:04:02.713359 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.713369 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.713619 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 13:04:02 crc kubenswrapper[4898]: E1211 13:04:02.713677 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.713794 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.713901 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.714257 4898 server.go:1280] "Started kubelet" Dec 11 13:04:02 crc kubenswrapper[4898]: E1211 13:04:02.714161 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 
13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.715482 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.715691 4898 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.715115 4898 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 11 13:04:02 crc systemd[1]: Started Kubernetes Kubelet. Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.720022 4898 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.721776 4898 server.go:460] "Adding debug handlers to kubelet server" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.721931 4898 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.721960 4898 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 11 13:04:02 crc kubenswrapper[4898]: E1211 13:04:02.722445 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.722634 4898 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.722903 4898 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 11 13:04:02 crc kubenswrapper[4898]: E1211 13:04:02.721416 4898 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.18:6443: connect: connection refused" 
event="&Event{ObjectMeta:{crc.18802aeafec404d0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-11 13:04:02.71420744 +0000 UTC m=+0.286533877,LastTimestamp:2025-12-11 13:04:02.71420744 +0000 UTC m=+0.286533877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.723458 4898 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 00:55:21.38379726 +0000 UTC Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.723631 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 13:04:02 crc kubenswrapper[4898]: E1211 13:04:02.723731 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.722660 4898 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.723866 4898 factory.go:55] Registering systemd factory Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.723882 4898 factory.go:221] Registration of the systemd container factory successfully Dec 11 13:04:02 crc kubenswrapper[4898]: E1211 
13:04:02.723881 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="200ms" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.725174 4898 factory.go:153] Registering CRI-O factory Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.725196 4898 factory.go:221] Registration of the crio container factory successfully Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.725260 4898 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.725287 4898 factory.go:103] Registering Raw factory Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.725306 4898 manager.go:1196] Started watching for new ooms in manager Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.726205 4898 manager.go:319] Starting recovery of all containers Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.734735 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.734832 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.734856 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.734904 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.734920 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.734935 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.734950 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.734965 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.734984 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735001 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735018 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735049 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735066 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735086 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735106 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" 
seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735123 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735139 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735156 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735172 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735191 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735207 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 
13:04:02.735231 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735250 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735297 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735315 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735330 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735350 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735375 4898 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735401 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735417 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735437 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735454 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735575 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735603 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735679 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735705 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735720 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735734 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735749 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735763 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735779 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735793 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735859 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735877 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735895 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735911 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" 
seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.735985 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736003 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736021 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736039 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736056 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736072 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 
13:04:02.736147 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736168 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736231 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736246 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736258 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736271 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736285 4898 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736296 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736307 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736318 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736331 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736342 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736354 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736365 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736377 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736388 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736398 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736408 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736419 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" 
seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736429 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736439 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.736450 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737129 4898 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737154 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737167 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737179 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737224 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737237 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737253 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737285 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737301 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737314 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737327 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737340 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737354 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737365 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737375 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737387 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737398 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737408 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737426 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737439 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737451 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737467 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737478 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737519 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737532 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737542 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737555 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737566 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737576 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737589 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737601 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737618 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737631 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" 
seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737643 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737654 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737667 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737679 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737695 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737707 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: 
I1211 13:04:02.737720 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737732 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737743 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737757 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737774 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737789 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737805 4898 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737818 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737831 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737844 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737855 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737866 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737880 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737891 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737902 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737914 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737925 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737937 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737949 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737960 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737971 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737982 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.737993 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738005 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738017 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738027 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738038 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738050 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738061 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738073 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738089 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" 
seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738100 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738120 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738131 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738143 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738153 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738165 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738176 4898 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738187 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738198 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738253 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738265 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738278 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738292 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738306 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738324 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738337 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738349 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738360 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738371 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738384 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738396 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738407 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738418 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738430 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738444 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738460 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738471 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738505 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738523 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738537 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738557 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738581 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738597 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738614 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738631 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738646 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738660 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738670 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738721 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738733 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738744 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738757 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738768 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738787 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738799 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738811 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738824 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738836 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738850 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738861 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738872 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738883 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738894 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738911 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738922 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738932 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738942 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738956 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.738969 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.739013 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.739030 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.739046 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.739061 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.739082 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.739096 4898 reconstruct.go:97] "Volume reconstruction finished"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.739103 4898 reconciler.go:26] "Reconciler: start to sync state"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.753726 4898 manager.go:324] Recovery completed
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.763326 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.764891 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.764917 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.764926 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.766094 4898 cpu_manager.go:225] "Starting CPU manager" policy="none"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.766110 4898 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.766159 4898 state_mem.go:36] "Initialized new in-memory state store"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.772141 4898 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.773645 4898 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.773676 4898 status_manager.go:217] "Starting to sync pod status with apiserver"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.773695 4898 kubelet.go:2335] "Starting kubelet main sync loop"
Dec 11 13:04:02 crc kubenswrapper[4898]: E1211 13:04:02.773728 4898 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 11 13:04:02 crc kubenswrapper[4898]: W1211 13:04:02.776346 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused
Dec 11 13:04:02 crc kubenswrapper[4898]: E1211 13:04:02.776467 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.782105 4898 policy_none.go:49] "None policy: Start"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.782693 4898 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.782720 4898 state_mem.go:35] "Initializing new in-memory state store"
Dec 11 13:04:02 crc kubenswrapper[4898]: E1211 13:04:02.823154 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.864037 4898 manager.go:334] "Starting Device Plugin manager"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.864104 4898 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.864124 4898 server.go:79] "Starting device plugin registration server"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.864722 4898 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.864761 4898 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.864944 4898 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.865239 4898 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.865255 4898 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 11 13:04:02 crc kubenswrapper[4898]: E1211 13:04:02.873343 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.874600 4898 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"]
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.874742 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.875907 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.875945 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.875960 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.876112 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.876848 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.876896 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.877020 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.877046 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.877059 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.877194 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.877700 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.877759 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.878111 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.878174 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.878191 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.879044 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.879068 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.879078 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.879173 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.879206 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.879213 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.879233 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.879556 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.879621 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.880113 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.880146 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.880159 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.880272 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.880539 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.880585 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.881136 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.881168 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.881184 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.881143 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.881265 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.881301 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.881481 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.881519 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.881532 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.881643 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.881709 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.882559 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.882585 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.882597 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:02 crc kubenswrapper[4898]: E1211 13:04:02.924819 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="400ms"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.940619 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.940685 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.940711 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.940738 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.940845 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.940898 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.940929 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.940958 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.940989 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.941015 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.941046 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.941076 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.941187 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.941240 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.941276 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.965148 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.966158 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.966187 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.966196 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:02 crc kubenswrapper[4898]: I1211 13:04:02.966223 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 11 13:04:02 crc kubenswrapper[4898]: E1211 13:04:02.966676 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.18:6443: connect: connection refused" node="crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.042784 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043069 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043244 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043389 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043425 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043452 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043489 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043553 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043519 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043599 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043604 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043643 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043627 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043619 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043719 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043652 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043699 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043824 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.043927 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.044006 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.044032 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.044056 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.044083 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.044104 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.044142 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.044119 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.044157 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.044216 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.044297 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.044213 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.167032 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.168604 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.168672 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.168689 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.168726 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 11 13:04:03 crc kubenswrapper[4898]: E1211 13:04:03.169244 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.18:6443: connect: connection refused" node="crc" Dec 11 13:04:03 
crc kubenswrapper[4898]: I1211 13:04:03.217957 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.225167 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.249145 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 11 13:04:03 crc kubenswrapper[4898]: W1211 13:04:03.257391 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-c4640c64406101a103cde0113f73f2764b3590003a220e00b42d0e93cd3e1f47 WatchSource:0}: Error finding container c4640c64406101a103cde0113f73f2764b3590003a220e00b42d0e93cd3e1f47: Status 404 returned error can't find the container with id c4640c64406101a103cde0113f73f2764b3590003a220e00b42d0e93cd3e1f47 Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.260247 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 11 13:04:03 crc kubenswrapper[4898]: W1211 13:04:03.262640 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-54cc4b1d3a45c03635026894d9469b020d4c192d5f73b837458e02ae827229a1 WatchSource:0}: Error finding container 54cc4b1d3a45c03635026894d9469b020d4c192d5f73b837458e02ae827229a1: Status 404 returned error can't find the container with id 54cc4b1d3a45c03635026894d9469b020d4c192d5f73b837458e02ae827229a1 Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.266711 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 11 13:04:03 crc kubenswrapper[4898]: W1211 13:04:03.282159 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-5e953fa0343d910215827463201df46c1a76ae6c77199f9335cc51dd083dbb4c WatchSource:0}: Error finding container 5e953fa0343d910215827463201df46c1a76ae6c77199f9335cc51dd083dbb4c: Status 404 returned error can't find the container with id 5e953fa0343d910215827463201df46c1a76ae6c77199f9335cc51dd083dbb4c Dec 11 13:04:03 crc kubenswrapper[4898]: W1211 13:04:03.283472 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-20f9ac57dcdbb5ab6c19ec7461afad2e0f2bc7161957c883037d4b193cfd4e28 WatchSource:0}: Error finding container 20f9ac57dcdbb5ab6c19ec7461afad2e0f2bc7161957c883037d4b193cfd4e28: Status 404 returned error can't find the container with id 20f9ac57dcdbb5ab6c19ec7461afad2e0f2bc7161957c883037d4b193cfd4e28 Dec 11 13:04:03 crc kubenswrapper[4898]: W1211 13:04:03.297095 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-1db7034e7c33e98177ba039b390175a727897e0757cf13f85f5607b6a34d9a34 WatchSource:0}: Error finding container 1db7034e7c33e98177ba039b390175a727897e0757cf13f85f5607b6a34d9a34: Status 404 returned error can't find the container with id 1db7034e7c33e98177ba039b390175a727897e0757cf13f85f5607b6a34d9a34 Dec 11 13:04:03 crc kubenswrapper[4898]: E1211 13:04:03.325933 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="800ms" Dec 11 
13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.569348 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.570998 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.571054 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.571071 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.571106 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 11 13:04:03 crc kubenswrapper[4898]: E1211 13:04:03.571670 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.18:6443: connect: connection refused" node="crc" Dec 11 13:04:03 crc kubenswrapper[4898]: W1211 13:04:03.636781 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 13:04:03 crc kubenswrapper[4898]: E1211 13:04:03.636867 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 13:04:03 crc kubenswrapper[4898]: W1211 13:04:03.665038 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 13:04:03 crc kubenswrapper[4898]: E1211 13:04:03.665130 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.716068 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.724402 4898 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 10:46:03.672496233 +0000 UTC Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.724443 4898 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 765h41m59.948055321s for next certificate rotation Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.778775 4898 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a" exitCode=0 Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.778873 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a"} Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.779029 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1db7034e7c33e98177ba039b390175a727897e0757cf13f85f5607b6a34d9a34"} Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.779158 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.780211 4898 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="62d748867b692dbb01e602f478b8af8cfee39ad2effae7fb32c2d53207c52519" exitCode=0 Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.780273 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"62d748867b692dbb01e602f478b8af8cfee39ad2effae7fb32c2d53207c52519"} Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.780305 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"20f9ac57dcdbb5ab6c19ec7461afad2e0f2bc7161957c883037d4b193cfd4e28"} Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.780389 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.780804 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.780835 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.780846 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.781197 4898 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.781228 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.781239 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.782131 4898 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9" exitCode=0 Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.782165 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9"} Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.782196 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5e953fa0343d910215827463201df46c1a76ae6c77199f9335cc51dd083dbb4c"} Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.782286 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.783891 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.783920 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.783930 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.784178 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3"} Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.784218 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"54cc4b1d3a45c03635026894d9469b020d4c192d5f73b837458e02ae827229a1"} Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.786460 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642" exitCode=0 Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.786508 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642"} Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.786581 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c4640c64406101a103cde0113f73f2764b3590003a220e00b42d0e93cd3e1f47"} Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.786700 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.787450 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.787494 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.787504 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.789112 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.789838 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.789876 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:03 crc kubenswrapper[4898]: I1211 13:04:03.789892 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:04 crc kubenswrapper[4898]: W1211 13:04:04.006391 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 13:04:04 crc kubenswrapper[4898]: E1211 13:04:04.006582 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 13:04:04 crc kubenswrapper[4898]: E1211 13:04:04.127303 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: 
connection refused" interval="1.6s" Dec 11 13:04:04 crc kubenswrapper[4898]: W1211 13:04:04.241401 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 13:04:04 crc kubenswrapper[4898]: E1211 13:04:04.241882 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.371842 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.374205 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.374247 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.374260 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.374511 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 11 13:04:04 crc kubenswrapper[4898]: E1211 13:04:04.375370 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.18:6443: connect: connection refused" node="crc" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.716529 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.789341 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"02c825a08c28d57d3fe2034e57f21ac7b8ca377df15a37ad6757b4d0c6632f2f"} Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.789449 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.790234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.790260 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.790271 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.793259 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6cbfca7c42a03929bac318edc66ce1a6c7f88f795fd3b58db686b8484edfac60"} Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.793303 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fcda5c81dd6cdc256d7acd24d32f956484e8823bed9e29c5713a722d8ebd6f46"} Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.793314 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e84fc12f1339e236216301663a693fc0ae8c043f282081b77e08455869556f40"} Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.793437 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.794358 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.794384 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.794394 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.796736 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de"} Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.796764 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd"} Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.796776 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec"} Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.796840 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 
11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.797541 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.797575 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.797589 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.801222 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534"} Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.801249 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021"} Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.801262 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5"} Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.801275 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59"} Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.801286 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5"} Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.801362 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.801961 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.801981 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.801988 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.803451 4898 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef" exitCode=0 Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.803489 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef"} Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.803557 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.804011 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.804028 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:04 crc kubenswrapper[4898]: I1211 13:04:04.804035 4898 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.010637 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.814125 4898 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21" exitCode=0 Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.814241 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21"} Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.814348 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.814479 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.814353 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.815873 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.815929 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.815953 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.816033 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:05 crc 
kubenswrapper[4898]: I1211 13:04:05.816071 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.816090 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.816154 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.816181 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.816191 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.884565 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.884789 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.884868 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.886448 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.886546 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.886570 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.976480 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.977620 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.977681 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.977699 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:05 crc kubenswrapper[4898]: I1211 13:04:05.977731 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 11 13:04:06 crc kubenswrapper[4898]: I1211 13:04:06.824655 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306"} Dec 11 13:04:06 crc kubenswrapper[4898]: I1211 13:04:06.824709 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646"} Dec 11 13:04:06 crc kubenswrapper[4898]: I1211 13:04:06.824729 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2"} Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 13:04:07.797514 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 13:04:07.797726 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 
13:04:07.799263 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 13:04:07.799329 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 13:04:07.799349 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 13:04:07.805268 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 13:04:07.833613 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5"} Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 13:04:07.833679 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550"} Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 13:04:07.833687 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 13:04:07.833782 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 13:04:07.837423 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 13:04:07.837524 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 
13:04:07.837545 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 13:04:07.840442 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 13:04:07.840541 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:07 crc kubenswrapper[4898]: I1211 13:04:07.840560 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.167040 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.167266 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.167318 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.169018 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.169133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.169159 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.528608 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.551255 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.837253 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.837311 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.837318 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.838855 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.838907 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.838925 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.839288 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.839450 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:08 crc kubenswrapper[4898]: I1211 13:04:08.839661 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:09 crc kubenswrapper[4898]: I1211 13:04:09.255450 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 13:04:09 crc kubenswrapper[4898]: I1211 13:04:09.839782 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 13:04:09 crc kubenswrapper[4898]: I1211 13:04:09.839825 4898 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:09 crc kubenswrapper[4898]: I1211 13:04:09.839944 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:09 crc kubenswrapper[4898]: I1211 13:04:09.840945 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:09 crc kubenswrapper[4898]: I1211 13:04:09.840970 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:09 crc kubenswrapper[4898]: I1211 13:04:09.840978 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:09 crc kubenswrapper[4898]: I1211 13:04:09.841330 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:09 crc kubenswrapper[4898]: I1211 13:04:09.841396 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:09 crc kubenswrapper[4898]: I1211 13:04:09.841451 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:11 crc kubenswrapper[4898]: I1211 13:04:11.743407 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:04:11 crc kubenswrapper[4898]: I1211 13:04:11.743672 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:11 crc kubenswrapper[4898]: I1211 13:04:11.745085 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:11 crc kubenswrapper[4898]: I1211 13:04:11.745145 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 
13:04:11 crc kubenswrapper[4898]: I1211 13:04:11.745164 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:12 crc kubenswrapper[4898]: I1211 13:04:12.120677 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 13:04:12 crc kubenswrapper[4898]: I1211 13:04:12.120851 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:12 crc kubenswrapper[4898]: I1211 13:04:12.122238 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:12 crc kubenswrapper[4898]: I1211 13:04:12.122290 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:12 crc kubenswrapper[4898]: I1211 13:04:12.122317 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:12 crc kubenswrapper[4898]: I1211 13:04:12.255995 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 13:04:12 crc kubenswrapper[4898]: I1211 13:04:12.256090 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 13:04:12 crc kubenswrapper[4898]: I1211 13:04:12.258220 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-etcd/etcd-crc" Dec 11 13:04:12 crc kubenswrapper[4898]: I1211 13:04:12.258397 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:12 crc kubenswrapper[4898]: I1211 13:04:12.259474 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:12 crc kubenswrapper[4898]: I1211 13:04:12.259507 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:12 crc kubenswrapper[4898]: I1211 13:04:12.259517 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:12 crc kubenswrapper[4898]: E1211 13:04:12.874126 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 11 13:04:14 crc kubenswrapper[4898]: I1211 13:04:14.866731 4898 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 11 13:04:14 crc kubenswrapper[4898]: I1211 13:04:14.866812 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 11 13:04:15 crc kubenswrapper[4898]: I1211 13:04:15.718200 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 11 13:04:15 crc kubenswrapper[4898]: E1211 
13:04:15.728725 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 11 13:04:15 crc kubenswrapper[4898]: W1211 13:04:15.869862 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 11 13:04:15 crc kubenswrapper[4898]: I1211 13:04:15.870014 4898 trace.go:236] Trace[461368671]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Dec-2025 13:04:05.868) (total time: 10001ms): Dec 11 13:04:15 crc kubenswrapper[4898]: Trace[461368671]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:04:15.869) Dec 11 13:04:15 crc kubenswrapper[4898]: Trace[461368671]: [10.001290699s] [10.001290699s] END Dec 11 13:04:15 crc kubenswrapper[4898]: E1211 13:04:15.870054 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 11 13:04:15 crc kubenswrapper[4898]: I1211 13:04:15.885351 4898 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 13:04:15 crc kubenswrapper[4898]: I1211 13:04:15.885434 4898 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 13:04:15 crc kubenswrapper[4898]: E1211 13:04:15.978532 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 11 13:04:16 crc kubenswrapper[4898]: I1211 13:04:16.020804 4898 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 11 13:04:16 crc kubenswrapper[4898]: I1211 13:04:16.020871 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 11 13:04:19 crc kubenswrapper[4898]: I1211 13:04:19.178730 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:19 crc kubenswrapper[4898]: I1211 13:04:19.180330 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:19 crc kubenswrapper[4898]: I1211 13:04:19.180393 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:19 crc kubenswrapper[4898]: I1211 13:04:19.180414 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 11 13:04:19 crc kubenswrapper[4898]: I1211 13:04:19.180499 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 11 13:04:19 crc kubenswrapper[4898]: E1211 13:04:19.184435 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 11 13:04:20 crc kubenswrapper[4898]: I1211 13:04:20.642743 4898 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 11 13:04:20 crc kubenswrapper[4898]: I1211 13:04:20.889583 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:04:20 crc kubenswrapper[4898]: I1211 13:04:20.889748 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:20 crc kubenswrapper[4898]: I1211 13:04:20.890750 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:20 crc kubenswrapper[4898]: I1211 13:04:20.890777 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:20 crc kubenswrapper[4898]: I1211 13:04:20.890788 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:20 crc kubenswrapper[4898]: I1211 13:04:20.895179 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.010999 4898 trace.go:236] Trace[1005342536]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Dec-2025 13:04:07.103) (total time: 13906ms): Dec 11 13:04:21 crc kubenswrapper[4898]: Trace[1005342536]: ---"Objects listed" error: 13906ms (13:04:21.010) Dec 11 13:04:21 crc 
kubenswrapper[4898]: Trace[1005342536]: [13.906951207s] [13.906951207s] END Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.011038 4898 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.011821 4898 trace.go:236] Trace[822672229]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Dec-2025 13:04:06.814) (total time: 14197ms): Dec 11 13:04:21 crc kubenswrapper[4898]: Trace[822672229]: ---"Objects listed" error: 14197ms (13:04:21.011) Dec 11 13:04:21 crc kubenswrapper[4898]: Trace[822672229]: [14.197363504s] [14.197363504s] END Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.011851 4898 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.014290 4898 trace.go:236] Trace[713391242]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Dec-2025 13:04:07.042) (total time: 13971ms): Dec 11 13:04:21 crc kubenswrapper[4898]: Trace[713391242]: ---"Objects listed" error: 13971ms (13:04:21.014) Dec 11 13:04:21 crc kubenswrapper[4898]: Trace[713391242]: [13.971890156s] [13.971890156s] END Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.014319 4898 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.014354 4898 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.050726 4898 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35500->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 
13:04:21.050779 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35500->192.168.126.11:17697: read: connection reset by peer" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.687018 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.694104 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.694398 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.720589 4898 apiserver.go:52] "Watching apiserver" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.723238 4898 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.723517 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.724065 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.724146 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.724236 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.724543 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.724656 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.724702 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.724718 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.724701 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.725631 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.726602 4898 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.728389 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.728780 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.728874 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.729203 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.729404 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.729749 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.729855 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.730341 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.730718 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.743694 4898 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.743764 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.754988 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.755302 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.783212 4898 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.798397 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.810178 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818391 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818425 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818445 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818491 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818507 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818522 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818538 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818553 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818590 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818605 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818621 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818636 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818652 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818669 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818687 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818705 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818721 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818813 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" 
(UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818860 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818878 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818876 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818920 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818894 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818996 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818986 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819022 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818992 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.818997 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819036 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819026 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819044 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819095 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819135 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819168 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819196 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819220 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819244 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819268 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819292 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819314 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819336 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819355 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819374 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819138 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819396 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819180 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819404 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819227 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819229 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819232 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819350 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819369 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819385 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819384 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819485 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819421 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819532 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819567 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819579 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819577 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819595 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819589 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819637 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819661 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819683 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819705 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819768 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819786 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819792 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819807 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819810 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819829 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819833 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819818 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819874 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819891 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819895 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819917 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819951 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819972 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 11 13:04:21 crc 
kubenswrapper[4898]: I1211 13:04:21.819996 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820021 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820045 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820069 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820092 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820117 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820141 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820165 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820188 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820216 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820241 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820262 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820282 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820304 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.819998 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820015 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820331 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820353 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820377 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820399 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820421 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820441 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820481 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820501 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820571 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820595 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820617 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 11 
13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820640 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820663 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820685 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820705 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820726 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820745 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820764 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820787 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820808 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820831 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820852 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 11 13:04:21 crc 
kubenswrapper[4898]: I1211 13:04:21.820875 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820957 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820980 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821001 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821035 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821059 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821080 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821102 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821127 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821148 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821170 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821189 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821207 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821231 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821255 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821281 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821305 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821330 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821352 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821374 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821393 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821414 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821437 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821477 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821503 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820031 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820048 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820072 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821561 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820103 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820122 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820189 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820198 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820204 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820297 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820301 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820331 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820346 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820348 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820362 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820380 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820415 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820588 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820630 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820820 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.820875 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821660 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821699 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821865 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821980 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.822415 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.822477 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.822502 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.822766 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821010 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821057 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821215 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821254 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821249 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821419 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.821543 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.823788 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.832584 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.834442 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.834621 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.834855 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.834903 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.835034 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:04:22.335010635 +0000 UTC m=+19.907337182 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.836101 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.836442 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.836472 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.836574 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.836788 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.836968 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.836974 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837254 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837274 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837257 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837375 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837425 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837441 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837477 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837557 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837589 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837621 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837639 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837658 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837673 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837690 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837711 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837728 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837749 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837766 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837792 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837807 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837827 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837851 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837867 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837883 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837903 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837919 
4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837933 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837948 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837964 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837981 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837998 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838014 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838029 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838050 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838066 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838081 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 11 13:04:21 crc 
kubenswrapper[4898]: I1211 13:04:21.838097 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838112 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838128 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838142 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838159 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838175 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838189 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838209 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838228 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838244 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838261 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 11 13:04:21 crc 
kubenswrapper[4898]: I1211 13:04:21.838275 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838306 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838328 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838343 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838359 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838378 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838396 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838411 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838432 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838446 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838483 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838501 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838516 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838531 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838546 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838561 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838575 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838594 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838617 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838637 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838655 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837387 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838683 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837883 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.837967 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838662 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838718 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838784 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838812 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.838975 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839012 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839039 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839062 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839086 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839110 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839539 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839569 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839595 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839621 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839644 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839669 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839691 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839714 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839731 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839773 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839806 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839830 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839853 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839878 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839903 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839928 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839948 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 13:04:21 crc 
kubenswrapper[4898]: I1211 13:04:21.839967 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839985 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840003 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840024 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840044 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840061 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840127 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840139 4898 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840149 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840159 4898 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840168 4898 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840177 4898 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840186 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840196 4898 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840206 4898 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840216 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840224 4898 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840424 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840433 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840442 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840450 4898 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840481 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840493 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840505 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840516 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840527 4898 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on 
node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840539 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840551 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840563 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840575 4898 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840586 4898 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840597 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840608 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 
13:04:21.840619 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840631 4898 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840643 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840655 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840669 4898 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840681 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840693 4898 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840704 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840716 4898 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840727 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840738 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840750 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840762 4898 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840775 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840787 4898 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840800 4898 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840813 4898 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840826 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840839 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840851 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840864 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840874 4898 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840883 4898 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840891 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840900 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840909 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840919 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840927 4898 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840936 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc 
kubenswrapper[4898]: I1211 13:04:21.840947 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840957 4898 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840966 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840975 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840984 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840993 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841001 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841010 4898 
reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841019 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841028 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841043 4898 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841052 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841061 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841070 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841079 4898 reconciler_common.go:293] "Volume detached for 
volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841088 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841097 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841106 4898 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841114 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841124 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841132 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841140 4898 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") 
on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841149 4898 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841158 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841166 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841175 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841184 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841193 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841202 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 
13:04:21.841211 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841220 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.846232 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.851045 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.839709 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.840137 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841272 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841342 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.853198 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841798 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841888 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.841929 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.842061 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.842138 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.842202 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.842282 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.842314 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.842330 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.842511 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.842568 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.845097 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.845145 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.845238 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.845238 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.845343 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.845344 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.845436 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.845421 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.845166 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.848869 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.850342 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.850650 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.854845 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:22.354820223 +0000 UTC m=+19.927146650 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.856543 4898 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.859769 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.863202 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.863241 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.863309 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.863387 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.863552 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.863722 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.863866 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.864223 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.863992 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.864343 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.865367 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.849684 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.851336 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.865594 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-11 13:04:22.365572431 +0000 UTC m=+19.937898918 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.851429 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.865786 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.865831 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.866913 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.867276 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.867793 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.867815 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.867832 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.867944 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.868138 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.868236 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:22.368220755 +0000 UTC m=+19.940547192 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.868262 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.868390 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.868790 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.868872 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.868874 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.868886 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.868900 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:21 crc kubenswrapper[4898]: E1211 13:04:21.869015 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:22.368974296 +0000 UTC m=+19.941300823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.869932 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.870002 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.870097 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.870430 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.870306 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.870820 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.870917 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.871124 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.871428 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.871518 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.871531 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.871895 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.871994 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.872770 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.872842 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.873506 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.874870 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.876205 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.881277 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.881758 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.882418 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.883149 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.883311 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.883208 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.883427 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.883623 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.883815 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.884056 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.884149 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.884162 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.884218 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.884614 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.884659 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.884856 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.885672 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.885947 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.886068 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.886351 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.886599 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.886508 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.886905 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.887086 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.886518 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.887703 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.889225 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534" exitCode=255 Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.889682 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534"} Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.890313 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.890450 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.890592 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.890777 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.890788 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.890783 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.890893 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.891061 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.891097 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.892013 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.892139 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.892210 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.892253 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.892270 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.892855 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.892910 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.893301 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.893768 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.893845 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.896866 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.902542 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.904511 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.908026 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.912029 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.919636 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.927501 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.937880 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.941717 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.941767 
4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.941879 4898 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.941892 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.941903 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.941913 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.941922 4898 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.941949 4898 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 
13:04:21.941987 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.941998 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942008 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942018 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942030 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942039 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942057 4898 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942067 4898 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942076 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942084 4898 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942092 4898 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942100 4898 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942109 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942117 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942127 4898 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 
crc kubenswrapper[4898]: I1211 13:04:21.942135 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942144 4898 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942153 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942162 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942170 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942179 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942188 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942197 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942205 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942216 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942225 4898 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942234 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942242 4898 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942250 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942259 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942268 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942276 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942285 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942294 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942304 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942313 4898 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942323 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" 
Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942332 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942341 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942349 4898 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942357 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942365 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942373 4898 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942405 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942414 4898 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942434 4898 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942443 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942466 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942476 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942484 4898 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942492 4898 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942500 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942520 4898 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942528 4898 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942536 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942544 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942553 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942639 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942647 4898 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc 
kubenswrapper[4898]: I1211 13:04:21.942655 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942664 4898 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942672 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942681 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942700 4898 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942708 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942716 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942723 4898 reconciler_common.go:293] "Volume detached for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942731 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942739 4898 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942746 4898 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942754 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942761 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942769 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942777 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node 
\"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942785 4898 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942793 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942801 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942809 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942818 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942828 4898 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942836 4898 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942845 4898 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942853 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942861 4898 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942869 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942877 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942886 4898 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942895 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942903 4898 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942912 4898 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.943016 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.943148 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.942922 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.943429 4898 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.943439 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" 
DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.943447 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.943881 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.943892 4898 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.943939 4898 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.943953 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.943962 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.943970 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: 
I1211 13:04:21.943979 4898 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.948731 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.956820 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.957086 4898 scope.go:117] "RemoveContainer" containerID="8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.957300 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 11 13:04:21 crc kubenswrapper[4898]: I1211 13:04:21.966926 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.039719 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.051319 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 13:04:22 crc kubenswrapper[4898]: W1211 13:04:22.051478 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-da39f485a52399a6647e5d5d284e8f75f69e86aac167e6b4e7aa5a0eaa0c045b WatchSource:0}: Error finding container da39f485a52399a6647e5d5d284e8f75f69e86aac167e6b4e7aa5a0eaa0c045b: Status 404 returned error can't find the container with id da39f485a52399a6647e5d5d284e8f75f69e86aac167e6b4e7aa5a0eaa0c045b Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.058822 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 13:04:22 crc kubenswrapper[4898]: W1211 13:04:22.071275 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-5da705528d8f52dac75bc0e8933f0291198edd28851afbf8e4dffa510f0e0b64 WatchSource:0}: Error finding container 5da705528d8f52dac75bc0e8933f0291198edd28851afbf8e4dffa510f0e0b64: Status 404 returned error can't find the container with id 5da705528d8f52dac75bc0e8933f0291198edd28851afbf8e4dffa510f0e0b64 Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.295191 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.309399 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.311941 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.322115 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.332774 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.344437 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.349647 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:04:22 crc kubenswrapper[4898]: E1211 13:04:22.349862 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:04:23.349816119 +0000 UTC m=+20.922142556 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.359382 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.373382 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.382435 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.391897 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.410707 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.427910 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.435218 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.448787 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.450099 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.450152 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.450217 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.450245 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:22 crc kubenswrapper[4898]: E1211 13:04:22.450274 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 13:04:22 crc kubenswrapper[4898]: E1211 13:04:22.450381 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 13:04:22 crc kubenswrapper[4898]: E1211 13:04:22.450405 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:23.450358155 +0000 UTC m=+21.022684582 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 13:04:22 crc kubenswrapper[4898]: E1211 13:04:22.450414 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 13:04:22 crc kubenswrapper[4898]: E1211 13:04:22.450296 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 13:04:22 crc kubenswrapper[4898]: E1211 13:04:22.450481 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 13:04:22 crc kubenswrapper[4898]: E1211 13:04:22.450504 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 13:04:22 crc kubenswrapper[4898]: E1211 13:04:22.450509 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:23.450489109 +0000 UTC m=+21.022815546 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 13:04:22 crc kubenswrapper[4898]: E1211 13:04:22.450517 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:22 crc kubenswrapper[4898]: E1211 13:04:22.450426 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:22 crc kubenswrapper[4898]: E1211 13:04:22.450557 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:23.4505499 +0000 UTC m=+21.022876337 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:22 crc kubenswrapper[4898]: E1211 13:04:22.450570 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:23.450564641 +0000 UTC m=+21.022891078 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.460915 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.474091 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.482025 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.490720 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.498536 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.706427 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-dlqfj"] Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.706903 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.708982 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-h86cf"] Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.709522 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h86cf" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.710049 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-7mmvk"] Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.710759 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.712972 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.713199 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.713416 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.713555 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.714382 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-7lxfm"] Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.715028 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qndxl"] Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.715143 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.715946 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.720759 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.720894 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.720942 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.721146 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.721329 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.721370 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.722282 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.722504 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.722705 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.722846 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.722918 4898 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.723525 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.723590 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.723815 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.723925 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.723941 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.724183 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.740950 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.745201 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752182 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-multus-cni-dir\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752218 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-run-netns\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752239 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-run-multus-certs\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752262 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c-proxy-tls\") pod \"machine-config-daemon-7mmvk\" (UID: \"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\") " pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752280 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/45192e97-2770-4866-9865-dc2f45b3f616-cni-binary-copy\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752310 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvrpx\" (UniqueName: \"kubernetes.io/projected/4e8ed6cb-b822-4b64-9e00-e755c5aea812-kube-api-access-bvrpx\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752329 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-openvswitch\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752355 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-env-overrides\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752372 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-os-release\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752405 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c-mcd-auth-proxy-config\") pod \"machine-config-daemon-7mmvk\" (UID: \"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\") " pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752448 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv2hc\" (UniqueName: \"kubernetes.io/projected/00c0a5c3-6be3-4c77-a628-cf6710a1f10f-kube-api-access-xv2hc\") pod \"node-resolver-h86cf\" (UID: \"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\") " pod="openshift-dns/node-resolver-h86cf" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752498 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-run-netns\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752522 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-run-k8s-cni-cncf-io\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 
13:04:22.752539 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-hostroot\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752572 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-kubelet\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752587 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-cnibin\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752601 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-node-log\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752653 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752677 
4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovnkube-config\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752705 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/45192e97-2770-4866-9865-dc2f45b3f616-cnibin\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752721 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/45192e97-2770-4866-9865-dc2f45b3f616-os-release\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752740 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-etc-openvswitch\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752761 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovn-node-metrics-cert\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 
13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752793 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-var-lib-cni-multus\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752810 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovnkube-script-lib\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752827 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-ovn\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752853 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-log-socket\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752881 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-cni-bin\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 
13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752948 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c-rootfs\") pod \"machine-config-daemon-7mmvk\" (UID: \"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\") " pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.752983 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96jns\" (UniqueName: \"kubernetes.io/projected/45192e97-2770-4866-9865-dc2f45b3f616-kube-api-access-96jns\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753021 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-systemd-units\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753049 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-var-lib-openvswitch\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753070 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-var-lib-kubelet\") pod \"multus-dlqfj\" (UID: 
\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753131 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-systemd\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753152 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-729s2\" (UniqueName: \"kubernetes.io/projected/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-kube-api-access-729s2\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753173 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-multus-conf-dir\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753198 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-run-ovn-kubernetes\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753218 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-multus-socket-dir-parent\") pod 
\"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753240 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-slash\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753263 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-cni-netd\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753288 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cv42\" (UniqueName: \"kubernetes.io/projected/b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c-kube-api-access-7cv42\") pod \"machine-config-daemon-7mmvk\" (UID: \"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\") " pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753308 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-var-lib-cni-bin\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753329 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4e8ed6cb-b822-4b64-9e00-e755c5aea812-multus-daemon-config\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753349 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-etc-kubernetes\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753375 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/45192e97-2770-4866-9865-dc2f45b3f616-system-cni-dir\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753395 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/45192e97-2770-4866-9865-dc2f45b3f616-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753417 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/00c0a5c3-6be3-4c77-a628-cf6710a1f10f-hosts-file\") pod \"node-resolver-h86cf\" (UID: \"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\") " pod="openshift-dns/node-resolver-h86cf" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753436 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-system-cni-dir\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753471 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4e8ed6cb-b822-4b64-9e00-e755c5aea812-cni-binary-copy\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.753493 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/45192e97-2770-4866-9865-dc2f45b3f616-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.756412 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.766598 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.778313 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.778769 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.778900 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.779778 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.780370 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.780983 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.781564 4898 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.782242 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.782790 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.783397 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.783934 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.784424 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.785107 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.787964 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.788573 4898 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.789166 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.790284 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.790513 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.790970 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.791792 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.792407 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.793011 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.794444 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.795163 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.796191 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.797009 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.797595 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.798900 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.799962 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.801112 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.801873 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.803031 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.803664 4898 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.804307 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.805260 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.806784 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.807587 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.808040 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.812118 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.812972 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.813604 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.814817 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.815803 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.816582 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.816815 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.817445 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.818796 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.819836 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.820299 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.821308 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.821867 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.823066 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.823684 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.824246 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.825234 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.825905 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.827057 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.827704 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.828417 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.842137 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.854689 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/45192e97-2770-4866-9865-dc2f45b3f616-cni-binary-copy\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.854728 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-multus-cni-dir\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.854751 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-run-netns\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.854769 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-run-multus-certs\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.854791 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c-proxy-tls\") pod \"machine-config-daemon-7mmvk\" (UID: \"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\") " pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.854810 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvrpx\" (UniqueName: \"kubernetes.io/projected/4e8ed6cb-b822-4b64-9e00-e755c5aea812-kube-api-access-bvrpx\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.854829 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-run-netns\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.854849 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-openvswitch\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.854867 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-env-overrides\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.854870 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-run-netns\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.854884 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-os-release\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.854922 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-multus-cni-dir\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.854989 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c-mcd-auth-proxy-config\") pod \"machine-config-daemon-7mmvk\" (UID: \"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\") " pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855022 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-run-multus-certs\") pod \"multus-dlqfj\" 
(UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855030 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv2hc\" (UniqueName: \"kubernetes.io/projected/00c0a5c3-6be3-4c77-a628-cf6710a1f10f-kube-api-access-xv2hc\") pod \"node-resolver-h86cf\" (UID: \"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\") " pod="openshift-dns/node-resolver-h86cf" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855084 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-os-release\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855092 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-run-k8s-cni-cncf-io\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855128 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-hostroot\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855161 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-kubelet\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 
13:04:22.855189 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-cnibin\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855215 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-etc-openvswitch\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855243 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-node-log\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855257 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-run-netns\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855273 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855293 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-openvswitch\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855301 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovnkube-config\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855349 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/45192e97-2770-4866-9865-dc2f45b3f616-cnibin\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855376 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/45192e97-2770-4866-9865-dc2f45b3f616-os-release\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855403 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovn-node-metrics-cert\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855428 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" 
(UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-var-lib-cni-multus\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855486 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovnkube-script-lib\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855521 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-var-lib-openvswitch\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855558 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-ovn\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855582 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-log-socket\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855610 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-cni-bin\") pod 
\"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855639 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c-rootfs\") pod \"machine-config-daemon-7mmvk\" (UID: \"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\") " pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855671 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96jns\" (UniqueName: \"kubernetes.io/projected/45192e97-2770-4866-9865-dc2f45b3f616-kube-api-access-96jns\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855699 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-systemd-units\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855723 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-var-lib-kubelet\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855776 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-systemd\") pod \"ovnkube-node-qndxl\" (UID: 
\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855785 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/45192e97-2770-4866-9865-dc2f45b3f616-cni-binary-copy\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855803 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-729s2\" (UniqueName: \"kubernetes.io/projected/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-kube-api-access-729s2\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855831 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-multus-socket-dir-parent\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855843 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-env-overrides\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855857 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-multus-conf-dir\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " 
pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855889 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-run-ovn-kubernetes\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855918 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-slash\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855945 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-cni-netd\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855976 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cv42\" (UniqueName: \"kubernetes.io/projected/b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c-kube-api-access-7cv42\") pod \"machine-config-daemon-7mmvk\" (UID: \"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\") " pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856008 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-var-lib-cni-bin\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" 
Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856039 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4e8ed6cb-b822-4b64-9e00-e755c5aea812-multus-daemon-config\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856071 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4e8ed6cb-b822-4b64-9e00-e755c5aea812-cni-binary-copy\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856103 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-etc-kubernetes\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856130 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/45192e97-2770-4866-9865-dc2f45b3f616-system-cni-dir\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856160 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/45192e97-2770-4866-9865-dc2f45b3f616-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 
13:04:22.856189 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/00c0a5c3-6be3-4c77-a628-cf6710a1f10f-hosts-file\") pod \"node-resolver-h86cf\" (UID: \"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\") " pod="openshift-dns/node-resolver-h86cf" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856220 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-system-cni-dir\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856243 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/45192e97-2770-4866-9865-dc2f45b3f616-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856440 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-etc-openvswitch\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856509 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-var-lib-openvswitch\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856479 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovnkube-script-lib\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856559 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-log-socket\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856565 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-ovn\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856596 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-cni-bin\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856608 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-run-k8s-cni-cncf-io\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856627 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-7mmvk\" (UID: \"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\") " pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856659 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-hostroot\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856662 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-cnibin\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856632 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-kubelet\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856697 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/45192e97-2770-4866-9865-dc2f45b3f616-cnibin\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855919 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-var-lib-cni-multus\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc 
kubenswrapper[4898]: I1211 13:04:22.856719 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-node-log\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856763 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856830 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/45192e97-2770-4866-9865-dc2f45b3f616-os-release\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856904 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-systemd-units\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856946 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-var-lib-kubelet\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.856976 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-systemd\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.857156 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-multus-socket-dir-parent\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.857192 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-multus-conf-dir\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.857224 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-run-ovn-kubernetes\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.855892 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c-rootfs\") pod \"machine-config-daemon-7mmvk\" (UID: \"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\") " pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.857263 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-slash\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.857411 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/00c0a5c3-6be3-4c77-a628-cf6710a1f10f-hosts-file\") pod \"node-resolver-h86cf\" (UID: \"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\") " pod="openshift-dns/node-resolver-h86cf" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.857502 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-etc-kubernetes\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.857553 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovnkube-config\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.857575 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-system-cni-dir\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.857586 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e8ed6cb-b822-4b64-9e00-e755c5aea812-host-var-lib-cni-bin\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " 
pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.857588 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/45192e97-2770-4866-9865-dc2f45b3f616-system-cni-dir\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.857637 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-cni-netd\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.857772 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4e8ed6cb-b822-4b64-9e00-e755c5aea812-cni-binary-copy\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.857882 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/45192e97-2770-4866-9865-dc2f45b3f616-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.857934 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/45192e97-2770-4866-9865-dc2f45b3f616-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 
13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.858078 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4e8ed6cb-b822-4b64-9e00-e755c5aea812-multus-daemon-config\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.858099 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.860999 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovn-node-metrics-cert\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.860888 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c-proxy-tls\") pod \"machine-config-daemon-7mmvk\" (UID: \"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\") " pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.877447 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.878881 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96jns\" (UniqueName: \"kubernetes.io/projected/45192e97-2770-4866-9865-dc2f45b3f616-kube-api-access-96jns\") pod \"multus-additional-cni-plugins-7lxfm\" (UID: \"45192e97-2770-4866-9865-dc2f45b3f616\") " pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.878937 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-729s2\" (UniqueName: \"kubernetes.io/projected/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-kube-api-access-729s2\") pod \"ovnkube-node-qndxl\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.887728 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xv2hc\" (UniqueName: \"kubernetes.io/projected/00c0a5c3-6be3-4c77-a628-cf6710a1f10f-kube-api-access-xv2hc\") pod \"node-resolver-h86cf\" (UID: \"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\") " pod="openshift-dns/node-resolver-h86cf" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.887760 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cv42\" (UniqueName: \"kubernetes.io/projected/b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c-kube-api-access-7cv42\") pod \"machine-config-daemon-7mmvk\" (UID: \"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\") " pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.892955 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5da705528d8f52dac75bc0e8933f0291198edd28851afbf8e4dffa510f0e0b64"} Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.894052 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f"} Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.894093 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"da39f485a52399a6647e5d5d284e8f75f69e86aac167e6b4e7aa5a0eaa0c045b"} Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.896575 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.898022 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0"} Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.898374 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.898373 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvrpx\" (UniqueName: \"kubernetes.io/projected/4e8ed6cb-b822-4b64-9e00-e755c5aea812-kube-api-access-bvrpx\") pod \"multus-dlqfj\" (UID: \"4e8ed6cb-b822-4b64-9e00-e755c5aea812\") " pod="openshift-multus/multus-dlqfj" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.902787 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.905606 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6"} Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.905659 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb"} Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.905671 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"648246e99a8eab4fb1a1cfbf6a44126f5c2a0ef3378cc33dfbdd4d6450297635"} Dec 11 13:04:22 crc kubenswrapper[4898]: E1211 13:04:22.913419 4898 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.925957 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.940161 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.950185 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.962541 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.974070 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:22 crc kubenswrapper[4898]: I1211 13:04:22.987680 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.000041 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.014341 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.021209 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dlqfj" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.026101 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-h86cf" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.027777 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.033676 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.038638 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" Dec 11 13:04:23 crc kubenswrapper[4898]: W1211 13:04:23.040743 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e8ed6cb_b822_4b64_9e00_e755c5aea812.slice/crio-30c3b38a6410b32a3da6586e733393efb15056a6a36e07e52fd61c7015f52601 WatchSource:0}: Error finding container 30c3b38a6410b32a3da6586e733393efb15056a6a36e07e52fd61c7015f52601: Status 404 returned error can't find the container with id 30c3b38a6410b32a3da6586e733393efb15056a6a36e07e52fd61c7015f52601 Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.043913 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.066488 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: W1211 13:04:23.071004 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1efa7034_8a95_4e6e_bd84_0189dc5acaa3.slice/crio-f1c2b6189ae69df6ce73a2937e9e0c5b5e610f3e04da5f5d832b13e9de9295d6 WatchSource:0}: Error finding container f1c2b6189ae69df6ce73a2937e9e0c5b5e610f3e04da5f5d832b13e9de9295d6: Status 404 returned error can't find the container with id f1c2b6189ae69df6ce73a2937e9e0c5b5e610f3e04da5f5d832b13e9de9295d6 Dec 11 13:04:23 crc kubenswrapper[4898]: W1211 13:04:23.071391 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45192e97_2770_4866_9865_dc2f45b3f616.slice/crio-791181cab4fb1ed7d72684b106e8d64375083ebb9dfe5144774b4eefbfc8fda7 WatchSource:0}: Error finding container 791181cab4fb1ed7d72684b106e8d64375083ebb9dfe5144774b4eefbfc8fda7: Status 404 returned error can't find the container with id 791181cab4fb1ed7d72684b106e8d64375083ebb9dfe5144774b4eefbfc8fda7 Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.094617 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.104808 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.118660 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.140970 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.158191 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.178911 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.191434 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.204214 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.221779 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.238482 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.266171 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.307115 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.354565 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e
6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.359944 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 
13:04:23.360183 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:04:25.360157103 +0000 UTC m=+22.932483600 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.387394 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.426735 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.461304 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 13:04:23.461430 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 13:04:23.461652 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:25.461632285 +0000 UTC m=+23.033958732 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.461561 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.461786 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 13:04:23.461873 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 13:04:23.462045 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 13:04:23.462135 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.461886 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 13:04:23.461969 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 13:04:23.462302 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 13:04:23.462318 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 13:04:23.461996 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 13:04:23.462421 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-11 13:04:25.462235562 +0000 UTC m=+23.034561999 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 13:04:23.462449 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:25.462441997 +0000 UTC m=+23.034768434 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 13:04:23.462485 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:25.462474708 +0000 UTC m=+23.034801155 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.475897 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.774623 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.774650 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.774699 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 13:04:23.774733 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 13:04:23.775028 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:23 crc kubenswrapper[4898]: E1211 13:04:23.775076 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.910790 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dlqfj" event={"ID":"4e8ed6cb-b822-4b64-9e00-e755c5aea812","Type":"ContainerStarted","Data":"76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995"} Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.911309 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dlqfj" event={"ID":"4e8ed6cb-b822-4b64-9e00-e755c5aea812","Type":"ContainerStarted","Data":"30c3b38a6410b32a3da6586e733393efb15056a6a36e07e52fd61c7015f52601"} Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.913507 4898 generic.go:334] "Generic (PLEG): container finished" podID="45192e97-2770-4866-9865-dc2f45b3f616" containerID="14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531" exitCode=0 Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.913603 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" event={"ID":"45192e97-2770-4866-9865-dc2f45b3f616","Type":"ContainerDied","Data":"14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531"} Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.913652 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" 
event={"ID":"45192e97-2770-4866-9865-dc2f45b3f616","Type":"ContainerStarted","Data":"791181cab4fb1ed7d72684b106e8d64375083ebb9dfe5144774b4eefbfc8fda7"} Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.915752 4898 generic.go:334] "Generic (PLEG): container finished" podID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerID="8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61" exitCode=0 Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.915825 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerDied","Data":"8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61"} Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.915853 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerStarted","Data":"f1c2b6189ae69df6ce73a2937e9e0c5b5e610f3e04da5f5d832b13e9de9295d6"} Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.918673 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf"} Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.918716 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6"} Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.918729 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" 
event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"68f8158fe4b917c76838068f083d557704eb069c1aa671b59efb279247619721"} Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.920393 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h86cf" event={"ID":"00c0a5c3-6be3-4c77-a628-cf6710a1f10f","Type":"ContainerStarted","Data":"8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91"} Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.920423 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h86cf" event={"ID":"00c0a5c3-6be3-4c77-a628-cf6710a1f10f","Type":"ContainerStarted","Data":"306f7a80bc75c005ed3d3e5ca976e34c0bfd4a080da37f750ac90badd34a3002"} Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.928971 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.951367 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 
13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.963804 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.981510 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:23 crc kubenswrapper[4898]: I1211 13:04:23.992535 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.001850 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.013125 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.025572 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.036140 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.047922 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.064482 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.073338 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.085921 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.110497 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.125645 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T
13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.146276 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116c
e6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.164524 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.187727 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jmvlr"] Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.188141 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jmvlr" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.196684 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.198470 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.219366 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.239374 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.258864 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.269559 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8009db93-3f68-4a84-87ae-863f64e231e1-host\") pod \"node-ca-jmvlr\" (UID: \"8009db93-3f68-4a84-87ae-863f64e231e1\") " pod="openshift-image-registry/node-ca-jmvlr" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.269602 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/8009db93-3f68-4a84-87ae-863f64e231e1-serviceca\") pod \"node-ca-jmvlr\" (UID: \"8009db93-3f68-4a84-87ae-863f64e231e1\") " pod="openshift-image-registry/node-ca-jmvlr" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.269629 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkmx4\" (UniqueName: \"kubernetes.io/projected/8009db93-3f68-4a84-87ae-863f64e231e1-kube-api-access-xkmx4\") pod \"node-ca-jmvlr\" (UID: \"8009db93-3f68-4a84-87ae-863f64e231e1\") " pod="openshift-image-registry/node-ca-jmvlr" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.307434 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.347188 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.370625 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8009db93-3f68-4a84-87ae-863f64e231e1-host\") pod \"node-ca-jmvlr\" (UID: \"8009db93-3f68-4a84-87ae-863f64e231e1\") " pod="openshift-image-registry/node-ca-jmvlr" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.370664 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8009db93-3f68-4a84-87ae-863f64e231e1-serviceca\") pod \"node-ca-jmvlr\" (UID: \"8009db93-3f68-4a84-87ae-863f64e231e1\") " pod="openshift-image-registry/node-ca-jmvlr" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.370686 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xkmx4\" (UniqueName: \"kubernetes.io/projected/8009db93-3f68-4a84-87ae-863f64e231e1-kube-api-access-xkmx4\") pod \"node-ca-jmvlr\" (UID: \"8009db93-3f68-4a84-87ae-863f64e231e1\") " pod="openshift-image-registry/node-ca-jmvlr" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.370857 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8009db93-3f68-4a84-87ae-863f64e231e1-host\") pod \"node-ca-jmvlr\" (UID: \"8009db93-3f68-4a84-87ae-863f64e231e1\") " pod="openshift-image-registry/node-ca-jmvlr" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.371653 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8009db93-3f68-4a84-87ae-863f64e231e1-serviceca\") pod \"node-ca-jmvlr\" (UID: \"8009db93-3f68-4a84-87ae-863f64e231e1\") " pod="openshift-image-registry/node-ca-jmvlr" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.386978 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.413704 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkmx4\" (UniqueName: \"kubernetes.io/projected/8009db93-3f68-4a84-87ae-863f64e231e1-kube-api-access-xkmx4\") pod \"node-ca-jmvlr\" (UID: \"8009db93-3f68-4a84-87ae-863f64e231e1\") " pod="openshift-image-registry/node-ca-jmvlr" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.448134 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.491246 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.533561 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.567851 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.567872 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jmvlr" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.615718 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.650925 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.695579 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.727587 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.768012 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.807663 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.850920 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.888240 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.924685 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f"} Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.925593 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jmvlr" event={"ID":"8009db93-3f68-4a84-87ae-863f64e231e1","Type":"ContainerStarted","Data":"08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf"} Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.925617 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jmvlr" event={"ID":"8009db93-3f68-4a84-87ae-863f64e231e1","Type":"ContainerStarted","Data":"a77da79ea056e6886dc090b12c56ba99f32df690a0a0eaa1b604dace5b687067"} Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.926696 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="45192e97-2770-4866-9865-dc2f45b3f616" containerID="35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121" exitCode=0 Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.926739 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" event={"ID":"45192e97-2770-4866-9865-dc2f45b3f616","Type":"ContainerDied","Data":"35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121"} Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.929857 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.933364 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerStarted","Data":"6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb"} Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.933399 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerStarted","Data":"d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982"} Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.933408 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" 
event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerStarted","Data":"dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0"} Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.933416 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerStarted","Data":"d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569"} Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.933426 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerStarted","Data":"a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90"} Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.933434 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerStarted","Data":"b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127"} Dec 11 13:04:24 crc kubenswrapper[4898]: I1211 13:04:24.968066 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.009071 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.047581 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.090930 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.127274 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.172749 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.213318 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.246564 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.291095 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.339953 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.364388 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.380600 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.380768 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:04:29.380747941 +0000 UTC m=+26.953074398 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.405853 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea
83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.453504 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.481620 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.481679 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.481713 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.481737 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.481762 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.481810 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.481836 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:29.481815141 +0000 UTC m=+27.054141588 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.481860 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-11 13:04:29.481847922 +0000 UTC m=+27.054174359 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.481872 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.481896 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.481906 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.481932 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.481946 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.481993 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:29.481976236 +0000 UTC m=+27.054302683 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.481912 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.482043 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:29.482031947 +0000 UTC m=+27.054358404 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.490983 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.529679 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 
13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.569165 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.585026 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.586765 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.586799 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.586808 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.586862 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.608757 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.661513 4898 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.662145 4898 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.663751 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:25 crc 
kubenswrapper[4898]: I1211 13:04:25.663810 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.663826 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.663851 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.663868 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:25Z","lastTransitionTime":"2025-12-11T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.684714 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.689691 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.689773 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.689798 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.689828 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.689855 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:25Z","lastTransitionTime":"2025-12-11T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.697007 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7
001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.711098 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.715689 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.715751 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.715776 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.715806 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.715834 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:25Z","lastTransitionTime":"2025-12-11T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.729169 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.733172 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.733253 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.733271 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.733159 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.733291 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.733347 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:25Z","lastTransitionTime":"2025-12-11T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.752513 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.756367 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.756398 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.756406 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.756420 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.756429 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:25Z","lastTransitionTime":"2025-12-11T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.770805 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.772004 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.772153 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.773802 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.773886 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.774133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.774150 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.774157 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.774168 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.774176 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:25Z","lastTransitionTime":"2025-12-11T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.774298 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.774342 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.774380 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:25 crc kubenswrapper[4898]: E1211 13:04:25.774422 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.807386 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.850970 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.876919 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.876981 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.876994 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.877017 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.877030 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:25Z","lastTransitionTime":"2025-12-11T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.887231 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z 
is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.926412 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.938718 4898 generic.go:334] "Generic (PLEG): container finished" podID="45192e97-2770-4866-9865-dc2f45b3f616" containerID="fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d" exitCode=0 Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.938827 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" event={"ID":"45192e97-2770-4866-9865-dc2f45b3f616","Type":"ContainerDied","Data":"fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d"} Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.974132 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.980221 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.980290 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.980305 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.980326 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:25 crc kubenswrapper[4898]: I1211 13:04:25.980340 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:25Z","lastTransitionTime":"2025-12-11T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.026295 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.050199 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.082603 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.082670 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.082683 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.082701 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.082713 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:26Z","lastTransitionTime":"2025-12-11T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.090388 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.131301 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.168655 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.188430 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.188508 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.188525 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.188549 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.188565 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:26Z","lastTransitionTime":"2025-12-11T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.209023 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.253229 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.287627 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.291050 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.291264 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:26 crc 
kubenswrapper[4898]: I1211 13:04:26.291377 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.291475 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.291752 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:26Z","lastTransitionTime":"2025-12-11T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.327164 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.371058 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.394228 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.394270 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.394281 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.394296 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.394307 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:26Z","lastTransitionTime":"2025-12-11T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.407980 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.447052 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.489112 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d0
75414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.497110 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.497160 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.497172 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.497188 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.497199 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:26Z","lastTransitionTime":"2025-12-11T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.524872 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.599730 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.599766 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.599777 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.599789 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.599800 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:26Z","lastTransitionTime":"2025-12-11T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.703784 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.703846 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.703866 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.703895 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.703919 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:26Z","lastTransitionTime":"2025-12-11T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.807360 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.807402 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.807415 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.807433 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.807446 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:26Z","lastTransitionTime":"2025-12-11T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.909975 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.910032 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.910048 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.910071 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.910089 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:26Z","lastTransitionTime":"2025-12-11T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.946972 4898 generic.go:334] "Generic (PLEG): container finished" podID="45192e97-2770-4866-9865-dc2f45b3f616" containerID="f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860" exitCode=0 Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.947083 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" event={"ID":"45192e97-2770-4866-9865-dc2f45b3f616","Type":"ContainerDied","Data":"f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860"} Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.967396 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:26 crc kubenswrapper[4898]: I1211 13:04:26.986329 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:26Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.003634 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.012064 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.012114 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:27 crc 
kubenswrapper[4898]: I1211 13:04:27.012135 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.012160 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.012177 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:27Z","lastTransitionTime":"2025-12-11T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.023255 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.038346 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.061254 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multu
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.073286 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.097496 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.115133 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.117362 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.117401 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.117409 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.117423 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.117431 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:27Z","lastTransitionTime":"2025-12-11T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.126509 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.139346 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116c
e6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.153557 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.168242 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.183524 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.200665 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.219162 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.219195 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.219204 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.219217 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.219227 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:27Z","lastTransitionTime":"2025-12-11T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.322174 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.322227 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.322241 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.322259 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.322283 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:27Z","lastTransitionTime":"2025-12-11T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.425446 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.425552 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.425573 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.425599 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.425617 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:27Z","lastTransitionTime":"2025-12-11T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.528485 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.528535 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.528551 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.528570 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.528582 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:27Z","lastTransitionTime":"2025-12-11T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.631288 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.631357 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.631370 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.631386 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.631421 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:27Z","lastTransitionTime":"2025-12-11T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.734117 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.734160 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.734171 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.734187 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.734201 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:27Z","lastTransitionTime":"2025-12-11T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.774852 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:27 crc kubenswrapper[4898]: E1211 13:04:27.774995 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.775070 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.775198 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:27 crc kubenswrapper[4898]: E1211 13:04:27.775277 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:27 crc kubenswrapper[4898]: E1211 13:04:27.775377 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.836568 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.836637 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.836654 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.836677 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.836694 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:27Z","lastTransitionTime":"2025-12-11T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.939799 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.939868 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.939885 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.939908 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.939924 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:27Z","lastTransitionTime":"2025-12-11T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.954374 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerStarted","Data":"e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0"} Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.956816 4898 generic.go:334] "Generic (PLEG): container finished" podID="45192e97-2770-4866-9865-dc2f45b3f616" containerID="951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688" exitCode=0 Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.956860 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" event={"ID":"45192e97-2770-4866-9865-dc2f45b3f616","Type":"ContainerDied","Data":"951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688"} Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.969893 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:27 crc kubenswrapper[4898]: I1211 13:04:27.987430 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:27.999944 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multu
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.010479 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:28Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.029449 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:28Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.040022 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:28Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.042226 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.042260 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.042270 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.042289 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:28 crc 
kubenswrapper[4898]: I1211 13:04:28.042300 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:28Z","lastTransitionTime":"2025-12-11T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.050738 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:28Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.065121 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:28Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.078598 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:28Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.100923 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:28Z is after 2025-08-24T17:21:41Z" Dec 11 
13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.113611 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:28Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.126482 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:28Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.138482 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:28Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.145136 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.145277 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.145335 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.145395 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.145498 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:28Z","lastTransitionTime":"2025-12-11T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.150880 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:28Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.165264 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:28Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.248075 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.248425 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:28 crc 
kubenswrapper[4898]: I1211 13:04:28.248439 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.248475 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.248488 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:28Z","lastTransitionTime":"2025-12-11T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.351007 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.351314 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.351410 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.351517 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.351593 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:28Z","lastTransitionTime":"2025-12-11T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.454831 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.454874 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.454885 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.454902 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.454914 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:28Z","lastTransitionTime":"2025-12-11T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.557209 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.557240 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.557248 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.557261 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.557270 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:28Z","lastTransitionTime":"2025-12-11T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.660698 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.660775 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.660797 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.660827 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.660848 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:28Z","lastTransitionTime":"2025-12-11T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.763142 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.763178 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.763187 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.763200 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.763211 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:28Z","lastTransitionTime":"2025-12-11T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.866697 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.866743 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.866755 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.866772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.866784 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:28Z","lastTransitionTime":"2025-12-11T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.964657 4898 generic.go:334] "Generic (PLEG): container finished" podID="45192e97-2770-4866-9865-dc2f45b3f616" containerID="39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d" exitCode=0 Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.964705 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" event={"ID":"45192e97-2770-4866-9865-dc2f45b3f616","Type":"ContainerDied","Data":"39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d"} Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.968889 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.968926 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.968938 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.968953 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:28 crc kubenswrapper[4898]: I1211 13:04:28.968965 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:28Z","lastTransitionTime":"2025-12-11T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.015330 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.032011 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 
13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.051382 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.067788 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545
e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.071169 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.071205 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.071214 4898 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.071229 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.071238 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:29Z","lastTransitionTime":"2025-12-11T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.083059 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07
372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.097970 4898 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.112236 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.128604 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.145534 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.161626 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multu
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.173017 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.173061 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.173072 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.173089 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.173101 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:29Z","lastTransitionTime":"2025-12-11T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.173223 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.193993 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.215902 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.227850 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.241618 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116c
e6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.275564 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.275856 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.275966 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:29 crc 
kubenswrapper[4898]: I1211 13:04:29.276085 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.276174 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:29Z","lastTransitionTime":"2025-12-11T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.379664 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.379707 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.379723 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.379745 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.379762 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:29Z","lastTransitionTime":"2025-12-11T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.416968 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 13:04:29.417200 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:04:37.417162513 +0000 UTC m=+34.989488990 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.482537 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.482592 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.482608 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.482633 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:29 crc kubenswrapper[4898]: 
I1211 13:04:29.482647 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:29Z","lastTransitionTime":"2025-12-11T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.518482 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.518584 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.518661 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.518713 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 13:04:29.518723 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 13:04:29.518814 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:37.518788459 +0000 UTC m=+35.091114936 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 13:04:29.518911 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 13:04:29.518933 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 13:04:29.519017 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 
13:04:29.519077 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 13:04:29.519083 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:37.519050186 +0000 UTC m=+35.091376653 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 13:04:29.519102 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 13:04:29.518946 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 13:04:29.519179 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 13:04:29.519293 4898 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:37.519217511 +0000 UTC m=+35.091544008 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 13:04:29.519328 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:37.519314163 +0000 UTC m=+35.091640740 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.585278 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.585324 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.585333 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.585348 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.585359 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:29Z","lastTransitionTime":"2025-12-11T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.688018 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.688055 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.688066 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.688085 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.688097 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:29Z","lastTransitionTime":"2025-12-11T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.774243 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.774272 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.774243 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 13:04:29.774422 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 13:04:29.774649 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:29 crc kubenswrapper[4898]: E1211 13:04:29.774748 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.790194 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.790222 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.790230 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.790243 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.790252 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:29Z","lastTransitionTime":"2025-12-11T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.893323 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.893367 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.893381 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.893403 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.893416 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:29Z","lastTransitionTime":"2025-12-11T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.974724 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" event={"ID":"45192e97-2770-4866-9865-dc2f45b3f616","Type":"ContainerStarted","Data":"d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3"} Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.984198 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerStarted","Data":"7d5786093fd7a55dfc5d4ea0d8610920f881ca9bc79361cbeeadf602c52cef54"} Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.984641 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.984703 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.996656 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.996724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.996746 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.996776 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:29 crc kubenswrapper[4898]: I1211 13:04:29.996798 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:29Z","lastTransitionTime":"2025-12-11T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.002153 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:
04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:29Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.015211 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.016729 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.019781 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b39
74b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.036592 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.054041 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.078996 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f7
5e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.091522 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.099390 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.099436 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.099447 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.099479 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.099492 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:30Z","lastTransitionTime":"2025-12-11T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.106436 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.135285 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.151832 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.169443 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.184838 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.201285 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.201325 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.201334 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.201348 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.201357 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:30Z","lastTransitionTime":"2025-12-11T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.201453 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.220508 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.235037 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.253012 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.267048 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.284834 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.301239 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.303867 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.303904 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.303914 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.303928 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.303937 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:30Z","lastTransitionTime":"2025-12-11T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.318887 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.335305 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.348896 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d0
75414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.358160 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b39
74b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.368751 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116c
e6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.384691 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d5786093fd7a55dfc5d4ea0d8610920f881ca9bc79361cbeeadf602c52cef54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.404039 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.405984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.406013 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.406022 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.406035 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.406046 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:30Z","lastTransitionTime":"2025-12-11T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.415370 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.427619 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.440947 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 
13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.455117 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.470354 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:30Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.507900 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.507943 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.507956 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.507990 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.508004 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:30Z","lastTransitionTime":"2025-12-11T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.610750 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.610820 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.610837 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.610861 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.610879 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:30Z","lastTransitionTime":"2025-12-11T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.713819 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.713885 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.713901 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.713926 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.713942 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:30Z","lastTransitionTime":"2025-12-11T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.816847 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.816883 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.816891 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.816905 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.816943 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:30Z","lastTransitionTime":"2025-12-11T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.919748 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.919820 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.919837 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.919859 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.919877 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:30Z","lastTransitionTime":"2025-12-11T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:30 crc kubenswrapper[4898]: I1211 13:04:30.987602 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.022349 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.022393 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.022402 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.022416 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.022426 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:31Z","lastTransitionTime":"2025-12-11T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.125151 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.125194 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.125205 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.125221 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.125233 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:31Z","lastTransitionTime":"2025-12-11T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.227964 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.227998 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.228008 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.228023 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.228035 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:31Z","lastTransitionTime":"2025-12-11T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.330359 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.330388 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.330396 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.330412 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.330420 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:31Z","lastTransitionTime":"2025-12-11T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.433135 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.433169 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.433178 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.433196 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.433206 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:31Z","lastTransitionTime":"2025-12-11T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.535196 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.535236 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.535249 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.535265 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.535277 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:31Z","lastTransitionTime":"2025-12-11T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.637618 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.637660 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.637669 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.637682 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.637691 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:31Z","lastTransitionTime":"2025-12-11T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.739326 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.739369 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.739378 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.739396 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.739409 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:31Z","lastTransitionTime":"2025-12-11T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.774822 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.774873 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.774823 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:31 crc kubenswrapper[4898]: E1211 13:04:31.774940 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:31 crc kubenswrapper[4898]: E1211 13:04:31.775090 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:31 crc kubenswrapper[4898]: E1211 13:04:31.775227 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.841975 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.842014 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.842026 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.842043 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.842054 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:31Z","lastTransitionTime":"2025-12-11T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.944798 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.944867 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.944888 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.944910 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.944927 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:31Z","lastTransitionTime":"2025-12-11T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:31 crc kubenswrapper[4898]: I1211 13:04:31.990175 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.046984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.047021 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.047034 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.047051 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.047063 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:32Z","lastTransitionTime":"2025-12-11T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.149773 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.149820 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.149835 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.149855 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.150180 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:32Z","lastTransitionTime":"2025-12-11T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.253702 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.253760 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.253772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.253794 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.253807 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:32Z","lastTransitionTime":"2025-12-11T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.357147 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.357212 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.357231 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.357260 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.357284 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:32Z","lastTransitionTime":"2025-12-11T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.460700 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.460769 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.460786 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.460810 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.460828 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:32Z","lastTransitionTime":"2025-12-11T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.563018 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.563086 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.563097 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.563115 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.563127 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:32Z","lastTransitionTime":"2025-12-11T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.665534 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.665597 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.665614 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.665635 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.665651 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:32Z","lastTransitionTime":"2025-12-11T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.768538 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.768585 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.768596 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.768613 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.768625 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:32Z","lastTransitionTime":"2025-12-11T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.793101 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d5786093fd7a55dfc5d4ea0d8610920f881ca9bc79361cbeeadf602c52cef54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.813579 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.825241 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.839425 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.856868 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.870959 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.871039 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.871062 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.871092 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.871114 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:32Z","lastTransitionTime":"2025-12-11T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.873721 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.891281 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.915274 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.930890 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.949123 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.967184 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.974080 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.974127 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:32 crc 
kubenswrapper[4898]: I1211 13:04:32.974142 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.974162 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.974178 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:32Z","lastTransitionTime":"2025-12-11T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.984842 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.995603 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovnkube-controller/0.log" Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.999081 4898 generic.go:334] "Generic (PLEG): container finished" podID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerID="7d5786093fd7a55dfc5d4ea0d8610920f881ca9bc79361cbeeadf602c52cef54" exitCode=1 Dec 11 13:04:32 crc kubenswrapper[4898]: I1211 13:04:32.999144 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerDied","Data":"7d5786093fd7a55dfc5d4ea0d8610920f881ca9bc79361cbeeadf602c52cef54"} Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:32.999960 4898 scope.go:117] "RemoveContainer" containerID="7d5786093fd7a55dfc5d4ea0d8610920f881ca9bc79361cbeeadf602c52cef54" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.003604 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.025083 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.038011 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.058645 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.077006 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.077062 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.077077 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.077098 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.077113 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:33Z","lastTransitionTime":"2025-12-11T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.077644 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.093681 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.109038 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.120883 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.134276 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.146840 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.157081 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.169195 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.178280 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.178713 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.178739 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.178748 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.178761 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.178770 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:33Z","lastTransitionTime":"2025-12-11T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.192441 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.213734 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.223272 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.235055 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.260849 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d5786093fd7a55dfc5d4ea0d8610920f881ca9bc79361cbeeadf602c52cef54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d5786093fd7a55dfc5d4ea0d8610920f881ca9bc79361cbeeadf602c52cef54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:32Z\\\",\\\"message\\\":\\\".go:140\\\\nI1211 13:04:32.001534 6218 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1211 
13:04:32.001853 6218 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1211 13:04:32.001870 6218 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1211 13:04:32.001929 6218 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1211 13:04:32.002095 6218 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1211 13:04:32.002453 6218 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1211 13:04:32.002547 6218 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-
overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termina
ted\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.281268 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.281308 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.281343 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.281364 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.281375 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:33Z","lastTransitionTime":"2025-12-11T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.383653 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.383695 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.383707 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.383721 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.383732 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:33Z","lastTransitionTime":"2025-12-11T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.486538 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.486601 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.486619 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.486643 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.486661 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:33Z","lastTransitionTime":"2025-12-11T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.588866 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.588900 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.588908 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.588922 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.588932 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:33Z","lastTransitionTime":"2025-12-11T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.691978 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.692057 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.692081 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.692120 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.692142 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:33Z","lastTransitionTime":"2025-12-11T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.774753 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.774753 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:33 crc kubenswrapper[4898]: E1211 13:04:33.774964 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:33 crc kubenswrapper[4898]: E1211 13:04:33.775055 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.774778 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:33 crc kubenswrapper[4898]: E1211 13:04:33.775211 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.795471 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.795516 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.795528 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.795546 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.795558 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:33Z","lastTransitionTime":"2025-12-11T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.898126 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.898164 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.898176 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.898193 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:33 crc kubenswrapper[4898]: I1211 13:04:33.898204 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:33Z","lastTransitionTime":"2025-12-11T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:33.999972 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.000003 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.000011 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.000023 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.000031 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:34Z","lastTransitionTime":"2025-12-11T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.004505 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovnkube-controller/0.log" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.009316 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerStarted","Data":"378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a"} Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.009536 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.036018 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.053820 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 
13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.067018 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.086505 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.102267 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.102300 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.102309 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.102323 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.102335 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:34Z","lastTransitionTime":"2025-12-11T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.110176 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7
001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.124075 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.135634 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.145738 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.155715 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.167420 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multu
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.176826 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.204825 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.204867 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.204880 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.204897 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.204910 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:34Z","lastTransitionTime":"2025-12-11T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.205786 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d5786093fd7a55dfc5d4ea0d8610920f881ca9bc79361cbeeadf602c52cef54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:32Z\\\",\\\"message\\\":\\\".go:140\\\\nI1211 13:04:32.001534 6218 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1211 13:04:32.001853 6218 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1211 
13:04:32.001870 6218 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1211 13:04:32.001929 6218 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1211 13:04:32.002095 6218 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1211 13:04:32.002453 6218 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1211 13:04:32.002547 6218 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\"
:\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.223402 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e
6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.233200 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.245021 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.307424 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.307539 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.307574 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.307605 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.307627 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:34Z","lastTransitionTime":"2025-12-11T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.410698 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.410754 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.410770 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.410791 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.410808 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:34Z","lastTransitionTime":"2025-12-11T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.514058 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.514112 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.514129 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.514152 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.514170 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:34Z","lastTransitionTime":"2025-12-11T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.616872 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.616949 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.616972 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.617002 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.617025 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:34Z","lastTransitionTime":"2025-12-11T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.720312 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.720370 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.720385 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.720408 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.720423 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:34Z","lastTransitionTime":"2025-12-11T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.823022 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.823086 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.823111 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.823146 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.823169 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:34Z","lastTransitionTime":"2025-12-11T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.926885 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.926949 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.926966 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.926994 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.927011 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:34Z","lastTransitionTime":"2025-12-11T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.947706 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv"] Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.948535 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.952000 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.952142 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.970932 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f71cb6f-609f-4fab-86eb-e01518fb8c61-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-brphv\" (UID: \"3f71cb6f-609f-4fab-86eb-e01518fb8c61\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.971029 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f71cb6f-609f-4fab-86eb-e01518fb8c61-env-overrides\") pod \"ovnkube-control-plane-749d76644c-brphv\" (UID: \"3f71cb6f-609f-4fab-86eb-e01518fb8c61\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.971102 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f71cb6f-609f-4fab-86eb-e01518fb8c61-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-brphv\" (UID: \"3f71cb6f-609f-4fab-86eb-e01518fb8c61\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.971160 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-qwpjl\" (UniqueName: \"kubernetes.io/projected/3f71cb6f-609f-4fab-86eb-e01518fb8c61-kube-api-access-qwpjl\") pod \"ovnkube-control-plane-749d76644c-brphv\" (UID: \"3f71cb6f-609f-4fab-86eb-e01518fb8c61\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.980571 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-
o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6
877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:34 crc kubenswrapper[4898]: I1211 13:04:34.993953 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.008701 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:35Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.030545 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.030586 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.030597 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.030614 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.030630 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:35Z","lastTransitionTime":"2025-12-11T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.041641 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d5786093fd7a55dfc5d4ea0d8610920f881ca9bc79361cbeeadf602c52cef54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:32Z\\\",\\\"message\\\":\\\".go:140\\\\nI1211 13:04:32.001534 6218 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1211 13:04:32.001853 6218 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1211 
13:04:32.001870 6218 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1211 13:04:32.001929 6218 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1211 13:04:32.002095 6218 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1211 13:04:32.002453 6218 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1211 13:04:32.002547 6218 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\"
:\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:35Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.055904 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:35Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.072432 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwpjl\" (UniqueName: \"kubernetes.io/projected/3f71cb6f-609f-4fab-86eb-e01518fb8c61-kube-api-access-qwpjl\") pod \"ovnkube-control-plane-749d76644c-brphv\" (UID: \"3f71cb6f-609f-4fab-86eb-e01518fb8c61\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.072554 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f71cb6f-609f-4fab-86eb-e01518fb8c61-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-brphv\" (UID: \"3f71cb6f-609f-4fab-86eb-e01518fb8c61\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.072582 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f71cb6f-609f-4fab-86eb-e01518fb8c61-env-overrides\") pod \"ovnkube-control-plane-749d76644c-brphv\" (UID: \"3f71cb6f-609f-4fab-86eb-e01518fb8c61\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.072612 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f71cb6f-609f-4fab-86eb-e01518fb8c61-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-brphv\" (UID: \"3f71cb6f-609f-4fab-86eb-e01518fb8c61\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.073380 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f71cb6f-609f-4fab-86eb-e01518fb8c61-env-overrides\") pod \"ovnkube-control-plane-749d76644c-brphv\" (UID: \"3f71cb6f-609f-4fab-86eb-e01518fb8c61\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.073476 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:35Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.073581 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f71cb6f-609f-4fab-86eb-e01518fb8c61-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-brphv\" (UID: \"3f71cb6f-609f-4fab-86eb-e01518fb8c61\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.078491 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f71cb6f-609f-4fab-86eb-e01518fb8c61-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-brphv\" (UID: \"3f71cb6f-609f-4fab-86eb-e01518fb8c61\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.088300 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:35Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.088492 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwpjl\" (UniqueName: \"kubernetes.io/projected/3f71cb6f-609f-4fab-86eb-e01518fb8c61-kube-api-access-qwpjl\") pod \"ovnkube-control-plane-749d76644c-brphv\" (UID: \"3f71cb6f-609f-4fab-86eb-e01518fb8c61\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.103556 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:35Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.120261 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:35Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.132895 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.132933 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.132945 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.132957 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.132967 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:35Z","lastTransitionTime":"2025-12-11T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.135620 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7
001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:35Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.146946 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:35Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.158379 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:35Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.169199 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:35Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.179993 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:35Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.194571 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multu
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:35Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.205630 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:35Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.235716 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.235758 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.235769 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.235786 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.235798 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:35Z","lastTransitionTime":"2025-12-11T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.270229 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" Dec 11 13:04:35 crc kubenswrapper[4898]: W1211 13:04:35.289850 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f71cb6f_609f_4fab_86eb_e01518fb8c61.slice/crio-29d3f90b7bbb7f0ae6fa7e3d5e8a1fe9b98409e812de8fd648a3037ce863d1ad WatchSource:0}: Error finding container 29d3f90b7bbb7f0ae6fa7e3d5e8a1fe9b98409e812de8fd648a3037ce863d1ad: Status 404 returned error can't find the container with id 29d3f90b7bbb7f0ae6fa7e3d5e8a1fe9b98409e812de8fd648a3037ce863d1ad Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.340052 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.340131 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.340158 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.340190 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.340214 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:35Z","lastTransitionTime":"2025-12-11T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.443099 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.443150 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.443166 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.443183 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.443196 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:35Z","lastTransitionTime":"2025-12-11T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.547178 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.547263 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.547285 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.547320 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.547340 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:35Z","lastTransitionTime":"2025-12-11T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.650309 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.650392 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.650414 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.650449 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.650520 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:35Z","lastTransitionTime":"2025-12-11T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.753122 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.753161 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.753184 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.753200 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.753211 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:35Z","lastTransitionTime":"2025-12-11T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.774078 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.774160 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:35 crc kubenswrapper[4898]: E1211 13:04:35.774227 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:35 crc kubenswrapper[4898]: E1211 13:04:35.774351 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.774082 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:35 crc kubenswrapper[4898]: E1211 13:04:35.774552 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.856043 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.856101 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.856118 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.856143 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.856164 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:35Z","lastTransitionTime":"2025-12-11T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.960381 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.960436 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.960448 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.960490 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:35 crc kubenswrapper[4898]: I1211 13:04:35.960504 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:35Z","lastTransitionTime":"2025-12-11T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.008767 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.008810 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.008819 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.008836 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.008847 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:36Z","lastTransitionTime":"2025-12-11T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.017841 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" event={"ID":"3f71cb6f-609f-4fab-86eb-e01518fb8c61","Type":"ContainerStarted","Data":"4cedba92b63b5430b84d32f2b59c5cc0c4e1cd2bc2d513f2f905ebb59c34639b"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.017898 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" event={"ID":"3f71cb6f-609f-4fab-86eb-e01518fb8c61","Type":"ContainerStarted","Data":"8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.017911 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" event={"ID":"3f71cb6f-609f-4fab-86eb-e01518fb8c61","Type":"ContainerStarted","Data":"29d3f90b7bbb7f0ae6fa7e3d5e8a1fe9b98409e812de8fd648a3037ce863d1ad"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.020161 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovnkube-controller/1.log" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.021089 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovnkube-controller/0.log" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.023948 4898 generic.go:334] "Generic (PLEG): container finished" podID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerID="378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a" exitCode=1 Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.023983 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" 
event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerDied","Data":"378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.024028 4898 scope.go:117] "RemoveContainer" containerID="7d5786093fd7a55dfc5d4ea0d8610920f881ca9bc79361cbeeadf602c52cef54" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.024841 4898 scope.go:117] "RemoveContainer" containerID="378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a" Dec 11 13:04:36 crc kubenswrapper[4898]: E1211 13:04:36.025059 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" Dec 11 13:04:36 crc kubenswrapper[4898]: E1211 13:04:36.028559 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.033266 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.033302 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.033313 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.033329 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.033340 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:36Z","lastTransitionTime":"2025-12-11T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.038084 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: E1211 13:04:36.052770 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.057607 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.057652 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.057664 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.057683 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.057694 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:36Z","lastTransitionTime":"2025-12-11T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.063414 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.083886 4898 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: E1211 13:04:36.084203 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.088820 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.088854 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.088864 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.088879 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.088890 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:36Z","lastTransitionTime":"2025-12-11T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:36 crc kubenswrapper[4898]: E1211 13:04:36.107845 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.113290 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plug
in\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.117428 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.117480 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.117491 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.117505 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.117514 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:36Z","lastTransitionTime":"2025-12-11T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.128937 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: E1211 13:04:36.130241 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: E1211 13:04:36.130344 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.131867 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.131891 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.131900 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.131912 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.131922 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:36Z","lastTransitionTime":"2025-12-11T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.142067 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.155181 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.168092 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.181269 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.194254 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multu
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.204140 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.216450 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116c
e6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.233803 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d5786093fd7a55dfc5d4ea0d8610920f881ca9bc79361cbeeadf602c52cef54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:32Z\\\",\\\"message\\\":\\\".go:140\\\\nI1211 13:04:32.001534 6218 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1211 13:04:32.001853 6218 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1211 
13:04:32.001870 6218 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1211 13:04:32.001929 6218 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1211 13:04:32.002095 6218 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1211 13:04:32.002453 6218 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1211 13:04:32.002547 6218 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\"
:\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.234109 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.234135 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.234147 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.234163 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.234174 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:36Z","lastTransitionTime":"2025-12-11T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.247431 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1cd2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.265287 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.276533 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.300038 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.309538 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.318922 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116c
e6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.336819 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.336878 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.336895 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:36 crc 
kubenswrapper[4898]: I1211 13:04:36.336923 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.336942 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:36Z","lastTransitionTime":"2025-12-11T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.339746 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d5786093fd7a55dfc5d4ea0d8610920f881ca9bc79361cbeeadf602c52cef54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:32Z\\\",\\\"message\\\":\\\".go:140\\\\nI1211 13:04:32.001534 6218 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1211 13:04:32.001853 6218 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1211 
13:04:32.001870 6218 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1211 13:04:32.001929 6218 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1211 13:04:32.002095 6218 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1211 13:04:32.002453 6218 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1211 13:04:32.002547 6218 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:35Z\\\",\\\"message\\\":\\\"ent time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:34.324602 6346 services_controller.go:434] Service openshift-network-console/networking-console-plugin retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{networking-console-plugin openshift-network-console 45966864-9dcc-4747-a124-95f4ab710d2d 13853 0 2025-02-23 05:40:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app.kubernetes.io/component:networking-console-plugin app.kubernetes.io/managed-by:cluster-network-operator 
app.kubernetes.io/name:networking-console-plugin app.kubernetes.io/part-of:cluster-network-operator] map[openshift.io/description:Expose the networking console plugin service on port 9443. This port is for internal use, and no other usage is guaranteed. service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:networking-console-plugin-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0073ce7ae 0xc0073ce7af}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app.kubernetes.io/component: networking-console-plugin,app.kuberne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cn
i-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.354046 4898 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1cd2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.367176 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.387573 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.406913 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.421431 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 
13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.435819 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.439337 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.439379 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.439391 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.439409 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.439421 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:36Z","lastTransitionTime":"2025-12-11T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.448656 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.460298 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.473596 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.478334 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-zcq7l"] Dec 11 13:04:36 crc 
kubenswrapper[4898]: I1211 13:04:36.478893 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:36 crc kubenswrapper[4898]: E1211 13:04:36.478969 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.488442 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1
b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.500133 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.511245 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.522287 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\
\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.534607 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrp
x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.541997 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.542046 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.542061 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.542082 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.542099 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:36Z","lastTransitionTime":"2025-12-11T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.546209 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.562056 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.583071 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.586339 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs\") pod \"network-metrics-daemon-zcq7l\" (UID: \"34380c7c-1d75-4f6f-a6cb-b015a55ca978\") " pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.586499 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cxl8\" (UniqueName: \"kubernetes.io/projected/34380c7c-1d75-4f6f-a6cb-b015a55ca978-kube-api-access-4cxl8\") pod \"network-metrics-daemon-zcq7l\" (UID: \"34380c7c-1d75-4f6f-a6cb-b015a55ca978\") " pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.596679 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.610742 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.633654 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d5786093fd7a55dfc5d4ea0d8610920f881ca9bc79361cbeeadf602c52cef54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:32Z\\\",\\\"message\\\":\\\".go:140\\\\nI1211 13:04:32.001534 6218 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1211 13:04:32.001853 6218 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1211 
13:04:32.001870 6218 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1211 13:04:32.001929 6218 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1211 13:04:32.002095 6218 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1211 13:04:32.002453 6218 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1211 13:04:32.002547 6218 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:35Z\\\",\\\"message\\\":\\\"ent time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:34.324602 6346 services_controller.go:434] Service openshift-network-console/networking-console-plugin retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{networking-console-plugin openshift-network-console 45966864-9dcc-4747-a124-95f4ab710d2d 13853 0 2025-02-23 05:40:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app.kubernetes.io/component:networking-console-plugin app.kubernetes.io/managed-by:cluster-network-operator 
app.kubernetes.io/name:networking-console-plugin app.kubernetes.io/part-of:cluster-network-operator] map[openshift.io/description:Expose the networking console plugin service on port 9443. This port is for internal use, and no other usage is guaranteed. service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:networking-console-plugin-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0073ce7ae 0xc0073ce7af}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app.kubernetes.io/component: networking-console-plugin,app.kuberne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cn
i-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.644609 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.644658 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.644673 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.644693 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.644711 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:36Z","lastTransitionTime":"2025-12-11T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.646204 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1cd2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.659339 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.674490 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.687173 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cxl8\" (UniqueName: \"kubernetes.io/projected/34380c7c-1d75-4f6f-a6cb-b015a55ca978-kube-api-access-4cxl8\") pod \"network-metrics-daemon-zcq7l\" (UID: 
\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\") " pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.687239 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs\") pod \"network-metrics-daemon-zcq7l\" (UID: \"34380c7c-1d75-4f6f-a6cb-b015a55ca978\") " pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:36 crc kubenswrapper[4898]: E1211 13:04:36.687375 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 13:04:36 crc kubenswrapper[4898]: E1211 13:04:36.687429 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs podName:34380c7c-1d75-4f6f-a6cb-b015a55ca978 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:37.187413708 +0000 UTC m=+34.759740155 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs") pod "network-metrics-daemon-zcq7l" (UID: "34380c7c-1d75-4f6f-a6cb-b015a55ca978") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.691894 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714
c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net
.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.704717 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.716969 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cxl8\" (UniqueName: \"kubernetes.io/projected/34380c7c-1d75-4f6f-a6cb-b015a55ca978-kube-api-access-4cxl8\") pod \"network-metrics-daemon-zcq7l\" (UID: \"34380c7c-1d75-4f6f-a6cb-b015a55ca978\") " pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.722330 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.737353 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.747395 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.747441 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.747477 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.747501 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.747517 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:36Z","lastTransitionTime":"2025-12-11T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.755914 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.771138 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:36Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.850089 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.850161 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.850184 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.850217 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.850239 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:36Z","lastTransitionTime":"2025-12-11T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.954329 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.954405 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.954427 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.954510 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:36 crc kubenswrapper[4898]: I1211 13:04:36.954545 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:36Z","lastTransitionTime":"2025-12-11T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.031771 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovnkube-controller/1.log" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.057637 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.057686 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.057699 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.057718 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.057731 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:37Z","lastTransitionTime":"2025-12-11T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.160211 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.160287 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.160310 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.160340 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.160360 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:37Z","lastTransitionTime":"2025-12-11T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.193380 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs\") pod \"network-metrics-daemon-zcq7l\" (UID: \"34380c7c-1d75-4f6f-a6cb-b015a55ca978\") " pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.193630 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.193742 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs podName:34380c7c-1d75-4f6f-a6cb-b015a55ca978 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:38.193710525 +0000 UTC m=+35.766037002 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs") pod "network-metrics-daemon-zcq7l" (UID: "34380c7c-1d75-4f6f-a6cb-b015a55ca978") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.262331 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.262394 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.262417 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.262447 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.262502 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:37Z","lastTransitionTime":"2025-12-11T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.365771 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.365838 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.365852 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.365875 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.366239 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:37Z","lastTransitionTime":"2025-12-11T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.469099 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.469179 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.469201 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.469234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.469256 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:37Z","lastTransitionTime":"2025-12-11T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.496884 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.497171 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:04:53.497132403 +0000 UTC m=+51.069458880 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.571492 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.571550 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.571571 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.571596 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.571614 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:37Z","lastTransitionTime":"2025-12-11T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.602141 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.602220 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.602284 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.602319 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.602547 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.602578 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.602594 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.602767 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.602963 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.602876 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.603064 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.603101 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.602892 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:53.60277627 +0000 UTC m=+51.175102767 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.603276 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:53.603182211 +0000 UTC m=+51.175508688 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.603997 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:53.603356466 +0000 UTC m=+51.175682933 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.604053 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:53.604026154 +0000 UTC m=+51.176352631 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.675649 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.675728 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.675753 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.675784 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.675807 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:37Z","lastTransitionTime":"2025-12-11T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.774600 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.774638 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.774617 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.774747 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.774820 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.774860 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.774935 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 11 13:04:37 crc kubenswrapper[4898]: E1211 13:04:37.775111 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.779102 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.779141 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.779153 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.779169 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.779181 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:37Z","lastTransitionTime":"2025-12-11T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.882420 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.882503 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.882516 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.882573 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.882593 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:37Z","lastTransitionTime":"2025-12-11T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.985874 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.985955 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.985978 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.986008 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:37 crc kubenswrapper[4898]: I1211 13:04:37.986034 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:37Z","lastTransitionTime":"2025-12-11T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.088812 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.088860 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.088869 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.088882 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.088891 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:38Z","lastTransitionTime":"2025-12-11T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.191493 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.191546 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.191557 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.191573 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.191584 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:38Z","lastTransitionTime":"2025-12-11T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.208954 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs\") pod \"network-metrics-daemon-zcq7l\" (UID: \"34380c7c-1d75-4f6f-a6cb-b015a55ca978\") " pod="openshift-multus/network-metrics-daemon-zcq7l"
Dec 11 13:04:38 crc kubenswrapper[4898]: E1211 13:04:38.209091 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 11 13:04:38 crc kubenswrapper[4898]: E1211 13:04:38.209147 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs podName:34380c7c-1d75-4f6f-a6cb-b015a55ca978 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:40.209130801 +0000 UTC m=+37.781457248 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs") pod "network-metrics-daemon-zcq7l" (UID: "34380c7c-1d75-4f6f-a6cb-b015a55ca978") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.293984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.294023 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.294031 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.294045 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.294053 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:38Z","lastTransitionTime":"2025-12-11T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.396921 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.397225 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.397309 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.397405 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.397521 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:38Z","lastTransitionTime":"2025-12-11T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.500538 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.500601 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.500619 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.500642 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.500661 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:38Z","lastTransitionTime":"2025-12-11T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.603214 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.603284 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.603300 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.603325 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.603345 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:38Z","lastTransitionTime":"2025-12-11T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.707337 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.707415 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.707438 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.707509 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.707540 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:38Z","lastTransitionTime":"2025-12-11T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.810222 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.810282 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.810300 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.810320 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.810337 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:38Z","lastTransitionTime":"2025-12-11T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.913856 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.913912 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.913936 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.913958 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:38 crc kubenswrapper[4898]: I1211 13:04:38.913973 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:38Z","lastTransitionTime":"2025-12-11T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.016875 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.016951 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.016964 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.016981 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.016993 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:39Z","lastTransitionTime":"2025-12-11T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.119759 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.119822 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.119839 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.119861 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.119878 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:39Z","lastTransitionTime":"2025-12-11T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.222327 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.222388 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.222409 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.222437 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.222490 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:39Z","lastTransitionTime":"2025-12-11T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.325099 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.325171 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.325196 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.325224 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.325246 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:39Z","lastTransitionTime":"2025-12-11T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.428710 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.428765 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.428781 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.428803 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.428822 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:39Z","lastTransitionTime":"2025-12-11T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.532077 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.532146 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.532163 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.532187 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.532208 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:39Z","lastTransitionTime":"2025-12-11T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.635543 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.635671 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.635725 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.635747 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.635768 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:39Z","lastTransitionTime":"2025-12-11T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.739401 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.739506 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.739553 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.739587 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.739611 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:39Z","lastTransitionTime":"2025-12-11T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.773958 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.774019 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.773961 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:39 crc kubenswrapper[4898]: E1211 13:04:39.774138 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:39 crc kubenswrapper[4898]: E1211 13:04:39.774271 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:04:39 crc kubenswrapper[4898]: E1211 13:04:39.774397 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.774623 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:39 crc kubenswrapper[4898]: E1211 13:04:39.774746 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.843044 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.843113 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.843135 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.843164 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.843187 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:39Z","lastTransitionTime":"2025-12-11T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.946655 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.946715 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.946731 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.946756 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:39 crc kubenswrapper[4898]: I1211 13:04:39.946774 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:39Z","lastTransitionTime":"2025-12-11T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.049401 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.049492 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.049510 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.049532 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.049549 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:40Z","lastTransitionTime":"2025-12-11T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.152711 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.152790 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.152814 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.152842 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.152863 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:40Z","lastTransitionTime":"2025-12-11T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.230321 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs\") pod \"network-metrics-daemon-zcq7l\" (UID: \"34380c7c-1d75-4f6f-a6cb-b015a55ca978\") " pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:40 crc kubenswrapper[4898]: E1211 13:04:40.230527 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 13:04:40 crc kubenswrapper[4898]: E1211 13:04:40.230625 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs podName:34380c7c-1d75-4f6f-a6cb-b015a55ca978 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:44.230599403 +0000 UTC m=+41.802925880 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs") pod "network-metrics-daemon-zcq7l" (UID: "34380c7c-1d75-4f6f-a6cb-b015a55ca978") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.255039 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.255089 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.255101 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.255116 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.255125 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:40Z","lastTransitionTime":"2025-12-11T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.357928 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.357991 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.358001 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.358022 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.358038 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:40Z","lastTransitionTime":"2025-12-11T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.460633 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.460691 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.460707 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.460729 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.460750 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:40Z","lastTransitionTime":"2025-12-11T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.563192 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.563238 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.563252 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.563272 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.563285 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:40Z","lastTransitionTime":"2025-12-11T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.666094 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.666174 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.666198 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.666227 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.666250 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:40Z","lastTransitionTime":"2025-12-11T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.769055 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.769120 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.769135 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.769156 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.769172 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:40Z","lastTransitionTime":"2025-12-11T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.779324 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.780738 4898 scope.go:117] "RemoveContainer" containerID="378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a" Dec 11 13:04:40 crc kubenswrapper[4898]: E1211 13:04:40.781191 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.792783 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.806589 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.819537 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:40 crc 
kubenswrapper[4898]: I1211 13:04:40.835777 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.848926 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 
13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.858895 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.871931 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.871981 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.871997 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.872021 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.872040 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:40Z","lastTransitionTime":"2025-12-11T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.873217 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.887677 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.901295 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.912599 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.925069 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.936113 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.956399 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.965416 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.974228 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.974264 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.974274 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.974290 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.974301 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:40Z","lastTransitionTime":"2025-12-11T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.977951 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:40 crc kubenswrapper[4898]: I1211 13:04:40.995402 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:35Z\\\",\\\"message\\\":\\\"ent time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:34.324602 6346 services_controller.go:434] Service openshift-network-console/networking-console-plugin retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{networking-console-plugin openshift-network-console 45966864-9dcc-4747-a124-95f4ab710d2d 13853 0 
2025-02-23 05:40:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app.kubernetes.io/component:networking-console-plugin app.kubernetes.io/managed-by:cluster-network-operator app.kubernetes.io/name:networking-console-plugin app.kubernetes.io/part-of:cluster-network-operator] map[openshift.io/description:Expose the networking console plugin service on port 9443. This port is for internal use, and no other usage is guaranteed. service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:networking-console-plugin-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0073ce7ae 0xc0073ce7af}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app.kubernetes.io/component: networking-console-plugin,app.kuberne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d0
7281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:40Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.007249 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1c
d2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:41Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.077014 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.077075 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.077093 4898 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.077117 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.077137 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:41Z","lastTransitionTime":"2025-12-11T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.180926 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.180995 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.181018 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.181044 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.181062 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:41Z","lastTransitionTime":"2025-12-11T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.283563 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.283639 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.283663 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.283694 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.283718 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:41Z","lastTransitionTime":"2025-12-11T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.386708 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.386770 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.386785 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.386807 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.386824 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:41Z","lastTransitionTime":"2025-12-11T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.489339 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.489417 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.489452 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.489507 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.489526 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:41Z","lastTransitionTime":"2025-12-11T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.591763 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.591811 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.591825 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.591848 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.591863 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:41Z","lastTransitionTime":"2025-12-11T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.695327 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.695399 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.695421 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.695489 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.695515 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:41Z","lastTransitionTime":"2025-12-11T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.748445 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.764665 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:41Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:41 crc 
kubenswrapper[4898]: I1211 13:04:41.774540 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.774604 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.774556 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.774545 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:41 crc kubenswrapper[4898]: E1211 13:04:41.774748 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:41 crc kubenswrapper[4898]: E1211 13:04:41.774846 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:04:41 crc kubenswrapper[4898]: E1211 13:04:41.774977 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:41 crc kubenswrapper[4898]: E1211 13:04:41.775111 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.779923 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\"
,\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d2
42327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:41Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.792905 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:41Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.797542 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.797598 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.797615 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.797638 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.797656 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:41Z","lastTransitionTime":"2025-12-11T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.808377 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:41Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.830785 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:41Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.847363 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:41Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.865737 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:41Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.885049 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:41Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.900013 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.900111 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:41 crc 
kubenswrapper[4898]: I1211 13:04:41.900133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.900543 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.900788 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:41Z","lastTransitionTime":"2025-12-11T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.903202 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:41Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.918077 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:41Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.937495 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multu
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:41Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.953093 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:41Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.966916 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116c
e6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:41Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:41 crc kubenswrapper[4898]: I1211 13:04:41.987616 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:35Z\\\",\\\"message\\\":\\\"ent time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:34.324602 6346 services_controller.go:434] Service openshift-network-console/networking-console-plugin retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{networking-console-plugin openshift-network-console 45966864-9dcc-4747-a124-95f4ab710d2d 13853 0 
2025-02-23 05:40:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app.kubernetes.io/component:networking-console-plugin app.kubernetes.io/managed-by:cluster-network-operator app.kubernetes.io/name:networking-console-plugin app.kubernetes.io/part-of:cluster-network-operator] map[openshift.io/description:Expose the networking console plugin service on port 9443. This port is for internal use, and no other usage is guaranteed. service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:networking-console-plugin-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0073ce7ae 0xc0073ce7af}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app.kubernetes.io/component: networking-console-plugin,app.kuberne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d0
7281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:41Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.004110 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.004180 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.004206 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.004236 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.004258 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:42Z","lastTransitionTime":"2025-12-11T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.005211 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1cd2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.029052 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e
6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.041590 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.107365 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.107408 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.107418 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.107435 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.107447 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:42Z","lastTransitionTime":"2025-12-11T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.210442 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.210518 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.210532 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.210552 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.210566 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:42Z","lastTransitionTime":"2025-12-11T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.313847 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.313912 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.313936 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.313963 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.313985 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:42Z","lastTransitionTime":"2025-12-11T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.417271 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.417348 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.417373 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.417402 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.417425 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:42Z","lastTransitionTime":"2025-12-11T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.520526 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.520595 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.520609 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.520631 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.520646 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:42Z","lastTransitionTime":"2025-12-11T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.623365 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.623430 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.623447 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.623493 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.623510 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:42Z","lastTransitionTime":"2025-12-11T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.725848 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.725897 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.725926 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.725942 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.725952 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:42Z","lastTransitionTime":"2025-12-11T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.796852 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.808374 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.820990 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.828156 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.828206 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.828219 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.828242 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.828255 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:42Z","lastTransitionTime":"2025-12-11T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.843727 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:35Z\\\",\\\"message\\\":\\\"ent time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:34.324602 6346 services_controller.go:434] Service openshift-network-console/networking-console-plugin retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{networking-console-plugin openshift-network-console 45966864-9dcc-4747-a124-95f4ab710d2d 13853 0 
2025-02-23 05:40:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app.kubernetes.io/component:networking-console-plugin app.kubernetes.io/managed-by:cluster-network-operator app.kubernetes.io/name:networking-console-plugin app.kubernetes.io/part-of:cluster-network-operator] map[openshift.io/description:Expose the networking console plugin service on port 9443. This port is for internal use, and no other usage is guaranteed. service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:networking-console-plugin-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0073ce7ae 0xc0073ce7af}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app.kubernetes.io/component: networking-console-plugin,app.kuberne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d0
7281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.857232 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1c
d2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.875261 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.891411 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.905958 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.922348 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc 
kubenswrapper[4898]: I1211 13:04:42.931973 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.932020 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.932031 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.932051 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.932065 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:42Z","lastTransitionTime":"2025-12-11T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.940819 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.955359 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.970641 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:42 crc kubenswrapper[4898]: I1211 13:04:42.988882 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.004779 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-11T13:04:43Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.021775 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-11T13:04:43Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.034336 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.034398 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.034421 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.034483 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.034509 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:43Z","lastTransitionTime":"2025-12-11T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.036570 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:43Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.051362 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:43Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.136785 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.136842 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.136854 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.136874 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.136889 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:43Z","lastTransitionTime":"2025-12-11T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.239408 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.239447 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.239494 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.239511 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.239519 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:43Z","lastTransitionTime":"2025-12-11T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.341627 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.341691 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.341707 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.341730 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.341748 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:43Z","lastTransitionTime":"2025-12-11T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.444367 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.444414 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.444426 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.444444 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.444483 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:43Z","lastTransitionTime":"2025-12-11T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.547161 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.547199 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.547207 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.547223 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.547232 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:43Z","lastTransitionTime":"2025-12-11T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.650429 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.650540 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.650558 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.650580 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.650596 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:43Z","lastTransitionTime":"2025-12-11T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.752936 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.752992 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.753010 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.753034 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.753052 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:43Z","lastTransitionTime":"2025-12-11T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.774606 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:43 crc kubenswrapper[4898]: E1211 13:04:43.774756 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.774603 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.774613 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.774603 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:43 crc kubenswrapper[4898]: E1211 13:04:43.775019 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:04:43 crc kubenswrapper[4898]: E1211 13:04:43.775161 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:43 crc kubenswrapper[4898]: E1211 13:04:43.775227 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.856008 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.856113 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.856139 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.856166 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.856187 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:43Z","lastTransitionTime":"2025-12-11T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.959056 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.959125 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.959146 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.959174 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:43 crc kubenswrapper[4898]: I1211 13:04:43.959196 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:43Z","lastTransitionTime":"2025-12-11T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.061825 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.061893 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.061906 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.061922 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.061934 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:44Z","lastTransitionTime":"2025-12-11T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.165180 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.165246 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.165257 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.165276 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.165289 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:44Z","lastTransitionTime":"2025-12-11T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.269398 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.269520 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.269545 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.269567 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.269580 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:44Z","lastTransitionTime":"2025-12-11T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.276252 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs\") pod \"network-metrics-daemon-zcq7l\" (UID: \"34380c7c-1d75-4f6f-a6cb-b015a55ca978\") " pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:44 crc kubenswrapper[4898]: E1211 13:04:44.276570 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 13:04:44 crc kubenswrapper[4898]: E1211 13:04:44.276677 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs podName:34380c7c-1d75-4f6f-a6cb-b015a55ca978 nodeName:}" failed. No retries permitted until 2025-12-11 13:04:52.276652482 +0000 UTC m=+49.848978909 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs") pod "network-metrics-daemon-zcq7l" (UID: "34380c7c-1d75-4f6f-a6cb-b015a55ca978") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.372175 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.372274 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.372300 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.372327 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.372346 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:44Z","lastTransitionTime":"2025-12-11T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.475194 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.475318 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.475343 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.475371 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.475391 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:44Z","lastTransitionTime":"2025-12-11T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.577638 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.577699 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.577716 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.577750 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.577769 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:44Z","lastTransitionTime":"2025-12-11T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.680808 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.680896 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.680920 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.680953 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.680978 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:44Z","lastTransitionTime":"2025-12-11T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.783044 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.783087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.783100 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.783116 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.783129 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:44Z","lastTransitionTime":"2025-12-11T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.885774 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.885839 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.885856 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.885886 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.885906 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:44Z","lastTransitionTime":"2025-12-11T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.989134 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.989297 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.989329 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.989360 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:44 crc kubenswrapper[4898]: I1211 13:04:44.989385 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:44Z","lastTransitionTime":"2025-12-11T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.092493 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.092567 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.092581 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.092614 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.092630 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:45Z","lastTransitionTime":"2025-12-11T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.195898 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.195958 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.195974 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.195998 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.196017 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:45Z","lastTransitionTime":"2025-12-11T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.299081 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.299127 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.299139 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.299154 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.299164 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:45Z","lastTransitionTime":"2025-12-11T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.401944 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.401977 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.401989 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.402006 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.402018 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:45Z","lastTransitionTime":"2025-12-11T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.505859 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.505941 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.505961 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.505985 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.506012 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:45Z","lastTransitionTime":"2025-12-11T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.609033 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.609066 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.609074 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.609087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.609095 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:45Z","lastTransitionTime":"2025-12-11T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.712082 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.712115 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.712126 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.712143 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.712156 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:45Z","lastTransitionTime":"2025-12-11T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.774221 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.774290 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.774251 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.774365 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:45 crc kubenswrapper[4898]: E1211 13:04:45.774632 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:04:45 crc kubenswrapper[4898]: E1211 13:04:45.774660 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:45 crc kubenswrapper[4898]: E1211 13:04:45.774720 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:45 crc kubenswrapper[4898]: E1211 13:04:45.774804 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.814113 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.814191 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.814214 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.814244 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.814266 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:45Z","lastTransitionTime":"2025-12-11T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.917971 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.918091 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.918159 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.918192 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:45 crc kubenswrapper[4898]: I1211 13:04:45.918256 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:45Z","lastTransitionTime":"2025-12-11T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.021941 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.022002 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.022013 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.022030 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.022042 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:46Z","lastTransitionTime":"2025-12-11T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.124173 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.124214 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.124227 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.124244 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.124255 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:46Z","lastTransitionTime":"2025-12-11T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.226960 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.227000 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.227011 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.227026 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.227036 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:46Z","lastTransitionTime":"2025-12-11T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.329360 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.329402 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.329412 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.329427 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.329437 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:46Z","lastTransitionTime":"2025-12-11T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.432125 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.432191 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.432215 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.432243 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.432270 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:46Z","lastTransitionTime":"2025-12-11T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.481533 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.481581 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.481593 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.481611 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.481622 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:46Z","lastTransitionTime":"2025-12-11T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:46 crc kubenswrapper[4898]: E1211 13:04:46.500098 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:46Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.504605 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.504681 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.504705 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.504734 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.504756 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:46Z","lastTransitionTime":"2025-12-11T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:46 crc kubenswrapper[4898]: E1211 13:04:46.520524 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:46Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.525198 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.525264 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.525292 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.525322 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.525343 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:46Z","lastTransitionTime":"2025-12-11T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:46 crc kubenswrapper[4898]: E1211 13:04:46.539672 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:46Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.543889 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.543948 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.543965 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.543987 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.544006 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:46Z","lastTransitionTime":"2025-12-11T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:46 crc kubenswrapper[4898]: E1211 13:04:46.563149 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:46Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.567368 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.567414 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.567422 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.567437 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.567448 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:46Z","lastTransitionTime":"2025-12-11T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:46 crc kubenswrapper[4898]: E1211 13:04:46.581252 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:46Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:46 crc kubenswrapper[4898]: E1211 13:04:46.581404 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.583018 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.583055 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.583067 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.583084 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.583096 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:46Z","lastTransitionTime":"2025-12-11T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.684759 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.684805 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.684818 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.684831 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.684840 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:46Z","lastTransitionTime":"2025-12-11T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.786952 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.787028 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.787054 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.787083 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.787108 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:46Z","lastTransitionTime":"2025-12-11T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.889398 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.889436 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.889447 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.889487 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.889502 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:46Z","lastTransitionTime":"2025-12-11T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.992792 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.992893 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.992922 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.992954 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:46 crc kubenswrapper[4898]: I1211 13:04:46.992980 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:46Z","lastTransitionTime":"2025-12-11T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.096158 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.096231 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.096249 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.096272 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.096290 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:47Z","lastTransitionTime":"2025-12-11T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.199802 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.199867 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.199884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.199908 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.199926 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:47Z","lastTransitionTime":"2025-12-11T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.303133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.303227 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.303251 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.303281 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.303304 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:47Z","lastTransitionTime":"2025-12-11T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.405449 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.405502 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.405510 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.405523 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.405531 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:47Z","lastTransitionTime":"2025-12-11T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.507601 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.507640 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.507651 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.507666 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.507677 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:47Z","lastTransitionTime":"2025-12-11T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.627321 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.627426 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.627447 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.627533 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.627554 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:47Z","lastTransitionTime":"2025-12-11T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.730769 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.730853 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.730878 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.730910 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.730935 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:47Z","lastTransitionTime":"2025-12-11T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.774502 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.774538 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.774583 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:47 crc kubenswrapper[4898]: E1211 13:04:47.774699 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.774718 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:47 crc kubenswrapper[4898]: E1211 13:04:47.774866 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:47 crc kubenswrapper[4898]: E1211 13:04:47.774969 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:47 crc kubenswrapper[4898]: E1211 13:04:47.775053 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.833958 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.834008 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.834024 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.834052 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.834074 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:47Z","lastTransitionTime":"2025-12-11T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.937585 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.937648 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.937665 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.937687 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:47 crc kubenswrapper[4898]: I1211 13:04:47.937709 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:47Z","lastTransitionTime":"2025-12-11T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.040842 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.040896 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.040913 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.040934 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.040948 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:48Z","lastTransitionTime":"2025-12-11T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.143282 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.143857 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.143931 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.144014 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.144130 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:48Z","lastTransitionTime":"2025-12-11T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.246699 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.246760 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.246777 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.246802 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.246823 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:48Z","lastTransitionTime":"2025-12-11T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.349693 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.349750 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.349769 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.349792 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.349810 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:48Z","lastTransitionTime":"2025-12-11T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.453372 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.453433 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.453483 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.453509 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.453526 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:48Z","lastTransitionTime":"2025-12-11T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.556767 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.556874 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.556895 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.556924 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.556942 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:48Z","lastTransitionTime":"2025-12-11T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.660524 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.660825 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.660946 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.661128 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.661249 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:48Z","lastTransitionTime":"2025-12-11T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.763863 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.764186 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.764316 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.764492 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.764688 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:48Z","lastTransitionTime":"2025-12-11T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.867170 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.867234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.867256 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.867285 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.867310 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:48Z","lastTransitionTime":"2025-12-11T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.970090 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.970126 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.970134 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.970146 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:48 crc kubenswrapper[4898]: I1211 13:04:48.970155 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:48Z","lastTransitionTime":"2025-12-11T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.072876 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.073643 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.073698 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.073729 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.073749 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:49Z","lastTransitionTime":"2025-12-11T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.177002 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.177712 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.177746 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.177767 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.177779 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:49Z","lastTransitionTime":"2025-12-11T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.280837 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.280904 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.280925 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.280949 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.280966 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:49Z","lastTransitionTime":"2025-12-11T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.383551 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.383593 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.383605 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.383619 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.383628 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:49Z","lastTransitionTime":"2025-12-11T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.486793 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.486869 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.486896 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.486926 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.486948 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:49Z","lastTransitionTime":"2025-12-11T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.589640 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.589702 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.589718 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.589744 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.589761 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:49Z","lastTransitionTime":"2025-12-11T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.692487 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.692541 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.692554 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.692574 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.692593 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:49Z","lastTransitionTime":"2025-12-11T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.774741 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.774782 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.774813 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.774849 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:49 crc kubenswrapper[4898]: E1211 13:04:49.775047 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:49 crc kubenswrapper[4898]: E1211 13:04:49.775161 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:49 crc kubenswrapper[4898]: E1211 13:04:49.775312 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:04:49 crc kubenswrapper[4898]: E1211 13:04:49.775447 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.795264 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.795314 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.795326 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.795343 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.795357 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:49Z","lastTransitionTime":"2025-12-11T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.897427 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.897504 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.897517 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.897533 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.897546 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:49Z","lastTransitionTime":"2025-12-11T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.999932 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:49.999993 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:49 crc kubenswrapper[4898]: I1211 13:04:50.000009 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.000027 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.000039 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:50Z","lastTransitionTime":"2025-12-11T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.103338 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.103412 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.103424 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.103443 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.103479 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:50Z","lastTransitionTime":"2025-12-11T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.206684 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.206764 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.206778 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.206799 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.206811 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:50Z","lastTransitionTime":"2025-12-11T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.309561 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.309610 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.309622 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.309640 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.309652 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:50Z","lastTransitionTime":"2025-12-11T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.412646 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.412714 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.412732 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.412766 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.412789 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:50Z","lastTransitionTime":"2025-12-11T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.514955 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.515003 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.515019 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.515037 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.515050 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:50Z","lastTransitionTime":"2025-12-11T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.617366 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.617506 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.617542 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.617596 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.617619 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:50Z","lastTransitionTime":"2025-12-11T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.720070 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.720143 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.720167 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.720196 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.720218 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:50Z","lastTransitionTime":"2025-12-11T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.823109 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.823151 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.823160 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.823180 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.823204 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:50Z","lastTransitionTime":"2025-12-11T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.925834 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.925887 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.925901 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.925917 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:50 crc kubenswrapper[4898]: I1211 13:04:50.925928 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:50Z","lastTransitionTime":"2025-12-11T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.029127 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.029169 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.029179 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.029193 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.029203 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:51Z","lastTransitionTime":"2025-12-11T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.132402 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.132447 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.132480 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.132498 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.132509 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:51Z","lastTransitionTime":"2025-12-11T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.234781 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.234845 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.234863 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.234887 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.234913 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:51Z","lastTransitionTime":"2025-12-11T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.337514 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.337563 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.337578 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.337599 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.337615 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:51Z","lastTransitionTime":"2025-12-11T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.441705 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.441760 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.441791 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.441817 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.441836 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:51Z","lastTransitionTime":"2025-12-11T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.544270 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.544348 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.544372 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.544401 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.544423 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:51Z","lastTransitionTime":"2025-12-11T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.647129 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.647183 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.647199 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.647221 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.647238 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:51Z","lastTransitionTime":"2025-12-11T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.749342 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.749377 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.749389 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.749404 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.749416 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:51Z","lastTransitionTime":"2025-12-11T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.774044 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.774077 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.774130 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.774160 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:51 crc kubenswrapper[4898]: E1211 13:04:51.774239 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:51 crc kubenswrapper[4898]: E1211 13:04:51.774322 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:51 crc kubenswrapper[4898]: E1211 13:04:51.774733 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:51 crc kubenswrapper[4898]: E1211 13:04:51.774974 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.851978 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.852012 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.852020 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.852033 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.852041 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:51Z","lastTransitionTime":"2025-12-11T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.956676 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.956739 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.956761 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.956791 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:51 crc kubenswrapper[4898]: I1211 13:04:51.956814 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:51Z","lastTransitionTime":"2025-12-11T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.059522 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.059562 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.059570 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.059586 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.059594 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:52Z","lastTransitionTime":"2025-12-11T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.162494 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.162531 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.162543 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.162559 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.162573 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:52Z","lastTransitionTime":"2025-12-11T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.265613 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.265688 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.265705 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.265724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.265738 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:52Z","lastTransitionTime":"2025-12-11T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.281182 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs\") pod \"network-metrics-daemon-zcq7l\" (UID: \"34380c7c-1d75-4f6f-a6cb-b015a55ca978\") " pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:52 crc kubenswrapper[4898]: E1211 13:04:52.281399 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 13:04:52 crc kubenswrapper[4898]: E1211 13:04:52.281565 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs podName:34380c7c-1d75-4f6f-a6cb-b015a55ca978 nodeName:}" failed. No retries permitted until 2025-12-11 13:05:08.281496332 +0000 UTC m=+65.853822829 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs") pod "network-metrics-daemon-zcq7l" (UID: "34380c7c-1d75-4f6f-a6cb-b015a55ca978") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.368874 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.368929 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.368942 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.368960 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.368973 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:52Z","lastTransitionTime":"2025-12-11T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.471628 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.471686 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.471702 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.471725 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.471739 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:52Z","lastTransitionTime":"2025-12-11T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.574799 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.574877 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.574900 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.574932 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.574950 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:52Z","lastTransitionTime":"2025-12-11T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.677539 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.677644 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.677682 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.677737 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.677762 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:52Z","lastTransitionTime":"2025-12-11T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.782127 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.782205 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.782231 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.782264 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.782289 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:52Z","lastTransitionTime":"2025-12-11T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.787794 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:52Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.802071 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:52Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.831650 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:35Z\\\",\\\"message\\\":\\\"ent time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:34.324602 6346 services_controller.go:434] Service openshift-network-console/networking-console-plugin retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{networking-console-plugin openshift-network-console 45966864-9dcc-4747-a124-95f4ab710d2d 13853 0 
2025-02-23 05:40:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app.kubernetes.io/component:networking-console-plugin app.kubernetes.io/managed-by:cluster-network-operator app.kubernetes.io/name:networking-console-plugin app.kubernetes.io/part-of:cluster-network-operator] map[openshift.io/description:Expose the networking console plugin service on port 9443. This port is for internal use, and no other usage is guaranteed. service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:networking-console-plugin-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0073ce7ae 0xc0073ce7af}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app.kubernetes.io/component: networking-console-plugin,app.kuberne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d0
7281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:52Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.846608 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1c
d2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:52Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.880912 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:52Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.884765 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.884811 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.884824 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.884845 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.884924 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:52Z","lastTransitionTime":"2025-12-11T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.906650 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:52Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.920425 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:52Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:52 crc 
kubenswrapper[4898]: I1211 13:04:52.938666 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23
770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 
13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:52Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.954727 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:52Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.969635 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:52Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.984787 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:52Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.986900 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.986934 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.986946 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.986958 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.986967 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:52Z","lastTransitionTime":"2025-12-11T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:52 crc kubenswrapper[4898]: I1211 13:04:52.998212 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:52Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.011653 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:53Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.027769 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:53Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.045320 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:53Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.065589 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-11T13:04:53Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.120734 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.120786 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.120801 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.120821 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.120847 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:53Z","lastTransitionTime":"2025-12-11T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.127510 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:53Z 
is after 2025-08-24T17:21:41Z" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.222637 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.222691 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.222701 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.222715 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.222725 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:53Z","lastTransitionTime":"2025-12-11T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.325174 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.325227 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.325241 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.325260 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.325273 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:53Z","lastTransitionTime":"2025-12-11T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.428338 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.428381 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.428392 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.428407 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.428418 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:53Z","lastTransitionTime":"2025-12-11T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.521898 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.522251 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-11 13:05:25.52220532 +0000 UTC m=+83.094531797 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.530015 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.530076 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.530094 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.530114 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.530132 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:53Z","lastTransitionTime":"2025-12-11T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.623605 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.623674 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.623711 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.623755 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.623846 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.623899 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.623910 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.623929 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.623932 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.623944 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.623944 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.623866 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.624001 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 13:05:25.62397469 +0000 UTC m=+83.196301157 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.624034 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 13:05:25.624016911 +0000 UTC m=+83.196343358 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.624051 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-11 13:05:25.624043672 +0000 UTC m=+83.196370119 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.624065 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-11 13:05:25.624058572 +0000 UTC m=+83.196385029 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.632574 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.632631 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.632645 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.632659 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.632668 4898 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:53Z","lastTransitionTime":"2025-12-11T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.735654 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.735727 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.735750 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.735772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.735789 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:53Z","lastTransitionTime":"2025-12-11T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.774350 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.774509 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.774361 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.774353 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.774587 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.774368 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.774791 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:53 crc kubenswrapper[4898]: E1211 13:04:53.774964 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.837991 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.838033 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.838043 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.838059 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.838069 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:53Z","lastTransitionTime":"2025-12-11T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.940908 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.940959 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.940971 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.940985 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:53 crc kubenswrapper[4898]: I1211 13:04:53.940996 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:53Z","lastTransitionTime":"2025-12-11T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.044016 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.044066 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.044077 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.044094 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.044106 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:54Z","lastTransitionTime":"2025-12-11T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.146006 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.146069 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.146092 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.146119 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.146140 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:54Z","lastTransitionTime":"2025-12-11T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.248172 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.248233 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.248245 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.248259 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.248269 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:54Z","lastTransitionTime":"2025-12-11T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.351159 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.351214 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.351236 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.351262 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.351283 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:54Z","lastTransitionTime":"2025-12-11T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.454178 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.454247 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.454263 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.454287 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.454304 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:54Z","lastTransitionTime":"2025-12-11T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.557113 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.557166 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.557184 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.557206 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.557224 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:54Z","lastTransitionTime":"2025-12-11T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.659572 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.659642 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.659669 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.659698 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.659724 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:54Z","lastTransitionTime":"2025-12-11T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.762912 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.762971 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.762987 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.763012 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.763040 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:54Z","lastTransitionTime":"2025-12-11T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.775307 4898 scope.go:117] "RemoveContainer" containerID="378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.866297 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.866353 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.866372 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.866397 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.866479 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:54Z","lastTransitionTime":"2025-12-11T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.969875 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.969914 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.969931 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.969953 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:54 crc kubenswrapper[4898]: I1211 13:04:54.969971 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:54Z","lastTransitionTime":"2025-12-11T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.018392 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.030310 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.039189 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.057232 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.071173 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.073195 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.073242 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:55 crc 
kubenswrapper[4898]: I1211 13:04:55.073257 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.073274 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.073286 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:55Z","lastTransitionTime":"2025-12-11T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.090327 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.100754 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.116662 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multu
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.128906 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovnkube-controller/1.log" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.131848 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerStarted","Data":"18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9"} Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.132364 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.132595 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.148797 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.167429 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:35Z\\\",\\\"message\\\":\\\"ent time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:34.324602 6346 services_controller.go:434] Service openshift-network-console/networking-console-plugin retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{networking-console-plugin openshift-network-console 45966864-9dcc-4747-a124-95f4ab710d2d 13853 0 
2025-02-23 05:40:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app.kubernetes.io/component:networking-console-plugin app.kubernetes.io/managed-by:cluster-network-operator app.kubernetes.io/name:networking-console-plugin app.kubernetes.io/part-of:cluster-network-operator] map[openshift.io/description:Expose the networking console plugin service on port 9443. This port is for internal use, and no other usage is guaranteed. service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:networking-console-plugin-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0073ce7ae 0xc0073ce7af}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app.kubernetes.io/component: networking-console-plugin,app.kuberne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d0
7281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.175972 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.176009 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.176020 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.176036 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.176047 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:55Z","lastTransitionTime":"2025-12-11T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.189186 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1cd2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.218038 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e
6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.227217 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.237006 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc 
kubenswrapper[4898]: I1211 13:04:55.249027 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23
770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 
13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.259908 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.274028 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.277646 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.277680 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.277689 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.277702 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.277711 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:55Z","lastTransitionTime":"2025-12-11T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.290018 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.301191 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbb0d9e2-813d-44ce-a547-a3c6f60c9f6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84fc12f1339e236216301663a693fc0ae8c043f282081b77e08455869556f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda5c81dd6cdc256d7acd24d32f956484e8823bed9e29c5713a722d8ebd6f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfca7c42a03929bac318edc66ce1a6c7f88f795fd3b58db686b8484edfac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf396
5ff73799b275591613d4035cb820253dced79b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.322215 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.332287 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.342704 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116c
e6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.365760 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:35Z\\\",\\\"message\\\":\\\"ent time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:34.324602 6346 services_controller.go:434] Service openshift-network-console/networking-console-plugin retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{networking-console-plugin openshift-network-console 45966864-9dcc-4747-a124-95f4ab710d2d 13853 0 
2025-02-23 05:40:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app.kubernetes.io/component:networking-console-plugin app.kubernetes.io/managed-by:cluster-network-operator app.kubernetes.io/name:networking-console-plugin app.kubernetes.io/part-of:cluster-network-operator] map[openshift.io/description:Expose the networking console plugin service on port 9443. This port is for internal use, and no other usage is guaranteed. service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:networking-console-plugin-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0073ce7ae 0xc0073ce7af}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app.kubernetes.io/component: 
networking-console-plugin,app.kuberne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/
run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.379107 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1c
d2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.380474 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.380504 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.380518 4898 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.380536 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.380549 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:55Z","lastTransitionTime":"2025-12-11T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.395160 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.409554 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 
13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.425274 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.440552 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.451187 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc 
kubenswrapper[4898]: I1211 13:04:55.465533 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.482801 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.482848 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.482863 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.482879 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.482891 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:55Z","lastTransitionTime":"2025-12-11T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.483497 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.501324 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.521164 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.534665 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.551217 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multu
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.565677 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.585669 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.585711 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.585720 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.585734 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.585743 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:55Z","lastTransitionTime":"2025-12-11T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.688490 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.688525 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.688533 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.688546 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.688555 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:55Z","lastTransitionTime":"2025-12-11T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.774131 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.774158 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:55 crc kubenswrapper[4898]: E1211 13:04:55.774263 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.774275 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.774314 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:55 crc kubenswrapper[4898]: E1211 13:04:55.774340 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:55 crc kubenswrapper[4898]: E1211 13:04:55.774447 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:55 crc kubenswrapper[4898]: E1211 13:04:55.774652 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.791891 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.791949 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.791960 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.791978 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.791992 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:55Z","lastTransitionTime":"2025-12-11T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.894740 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.894782 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.894792 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.894810 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.894820 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:55Z","lastTransitionTime":"2025-12-11T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.997567 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.997649 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.997670 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.997703 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:55 crc kubenswrapper[4898]: I1211 13:04:55.997726 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:55Z","lastTransitionTime":"2025-12-11T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.100681 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.100737 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.100754 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.100776 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.100792 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:56Z","lastTransitionTime":"2025-12-11T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.138414 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovnkube-controller/2.log" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.139665 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovnkube-controller/1.log" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.144376 4898 generic.go:334] "Generic (PLEG): container finished" podID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerID="18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9" exitCode=1 Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.144479 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerDied","Data":"18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9"} Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.144538 4898 scope.go:117] "RemoveContainer" containerID="378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.145097 4898 scope.go:117] "RemoveContainer" containerID="18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9" Dec 11 13:04:56 crc kubenswrapper[4898]: E1211 13:04:56.145276 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.165556 4898 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.185251 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.199890 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.204575 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.204612 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.204624 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.204641 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.204654 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:56Z","lastTransitionTime":"2025-12-11T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.216086 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.228875 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.244794 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d0
75414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.256348 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b39
74b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.269730 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116c
e6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.290551 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://378427d319c4df6bcb001488490b4a1a785a18161dfed6887f08fbcea9e06c5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:35Z\\\",\\\"message\\\":\\\"ent time 2025-12-11T13:04:34Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:34.324602 6346 services_controller.go:434] Service openshift-network-console/networking-console-plugin retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{networking-console-plugin openshift-network-console 45966864-9dcc-4747-a124-95f4ab710d2d 13853 0 
2025-02-23 05:40:35 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[app.kubernetes.io/component:networking-console-plugin app.kubernetes.io/managed-by:cluster-network-operator app.kubernetes.io/name:networking-console-plugin app.kubernetes.io/part-of:cluster-network-operator] map[openshift.io/description:Expose the networking console plugin service on port 9443. This port is for internal use, and no other usage is guaranteed. service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:networking-console-plugin-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0073ce7ae 0xc0073ce7af}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app.kubernetes.io/component: networking-console-plugin,app.kuberne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"message\\\":\\\"pin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1211 13:04:55.603698 6618 services_controller.go:356] Processing sync for service 
openshift-marketplace/redhat-marketplace for network=default\\\\nF1211 13:04:55.603702 6618 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:55.6\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\
":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-over
rides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 
13:04:56.303646 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1cd2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.307424 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.307601 4898 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.307695 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.307794 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.307888 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:56Z","lastTransitionTime":"2025-12-11T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.317623 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbb0d9e2-813d-44ce-a547-a3c6f60c9f6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84fc12f1339e236216301663a693fc0ae8c043f282081b77e08455869556f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda5c81dd6cdc256d7acd24d32f956484e8823bed9e29c5713a722d8ebd6f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfca7c42a03929bac318edc66ce1a6c7f88f795fd3b58db686b8484edfac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.342663 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.354545 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.367328 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc 
kubenswrapper[4898]: I1211 13:04:56.381899 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23
770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 
13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.394509 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.407750 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.410380 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.410404 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.410416 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.410431 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.410442 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:56Z","lastTransitionTime":"2025-12-11T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.421353 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.512823 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.512891 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.512918 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.512948 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.512971 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:56Z","lastTransitionTime":"2025-12-11T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.616174 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.616237 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.616254 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.616279 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.616297 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:56Z","lastTransitionTime":"2025-12-11T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.672226 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.672282 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.672295 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.672317 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.672329 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:56Z","lastTransitionTime":"2025-12-11T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:56 crc kubenswrapper[4898]: E1211 13:04:56.686449 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.690162 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.690732 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.690817 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.690934 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.691005 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:56Z","lastTransitionTime":"2025-12-11T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:56 crc kubenswrapper[4898]: E1211 13:04:56.704196 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.707619 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.707761 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.707877 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.707999 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.708203 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:56Z","lastTransitionTime":"2025-12-11T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:56 crc kubenswrapper[4898]: E1211 13:04:56.726066 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.730912 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.730958 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.730970 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.730989 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.731001 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:56Z","lastTransitionTime":"2025-12-11T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:56 crc kubenswrapper[4898]: E1211 13:04:56.742512 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.746581 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.746670 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.746682 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.746697 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.746709 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:56Z","lastTransitionTime":"2025-12-11T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:56 crc kubenswrapper[4898]: E1211 13:04:56.765211 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:56Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:56 crc kubenswrapper[4898]: E1211 13:04:56.765426 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.767068 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.767101 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.767110 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.767123 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.767131 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:56Z","lastTransitionTime":"2025-12-11T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.869744 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.870078 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.870254 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.870552 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.870785 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:56Z","lastTransitionTime":"2025-12-11T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.973642 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.973713 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.973734 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.973758 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:56 crc kubenswrapper[4898]: I1211 13:04:56.973775 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:56Z","lastTransitionTime":"2025-12-11T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.076634 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.076718 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.076736 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.076761 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.076778 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:57Z","lastTransitionTime":"2025-12-11T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.151866 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovnkube-controller/2.log" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.157788 4898 scope.go:117] "RemoveContainer" containerID="18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9" Dec 11 13:04:57 crc kubenswrapper[4898]: E1211 13:04:57.158098 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.179784 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.179862 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.179873 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.179900 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.179910 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:57Z","lastTransitionTime":"2025-12-11T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.180313 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.198818 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.219434 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.237379 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.254056 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116c
e6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.275328 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"message\\\":\\\"pin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1211 13:04:55.603698 6618 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nF1211 13:04:55.603702 6618 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:55.6\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d0
7281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.284137 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.284209 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.284231 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.284260 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.284281 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:57Z","lastTransitionTime":"2025-12-11T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.293803 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1cd2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.307601 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbb0d9e2-813d-44ce-a547-a3c6f60c9f6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84fc12f1339e236216301663a693fc0ae8c043f282081b77e08455869556f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda5c81dd6cdc256d7acd24d32f956484e8823bed9e29c5713a722d8ebd6f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfca7c42a03929bac318edc66ce1a6c7f88f795fd3b58db686b8484edfac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.326063 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.335965 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.344714 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc 
kubenswrapper[4898]: I1211 13:04:57.359885 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23
770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 
13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.370668 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.384813 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.386699 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.386954 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.387159 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.387356 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.387674 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:57Z","lastTransitionTime":"2025-12-11T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.405572 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.419631 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.436093 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.454996 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:57Z is after 2025-08-24T17:21:41Z" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.489878 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.490102 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:57 crc 
kubenswrapper[4898]: I1211 13:04:57.490181 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.490260 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.490339 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:57Z","lastTransitionTime":"2025-12-11T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.592699 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.592768 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.592794 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.592828 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.592852 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:57Z","lastTransitionTime":"2025-12-11T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.696911 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.696983 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.696996 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.697015 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.697029 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:57Z","lastTransitionTime":"2025-12-11T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.774078 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.774160 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.774101 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:04:57 crc kubenswrapper[4898]: E1211 13:04:57.774255 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:04:57 crc kubenswrapper[4898]: E1211 13:04:57.774513 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:04:57 crc kubenswrapper[4898]: E1211 13:04:57.774610 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.774101 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:04:57 crc kubenswrapper[4898]: E1211 13:04:57.774881 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.799823 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.799887 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.799913 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.799943 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.799966 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:57Z","lastTransitionTime":"2025-12-11T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.903220 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.903283 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.903354 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.903391 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:04:57 crc kubenswrapper[4898]: I1211 13:04:57.903413 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:57Z","lastTransitionTime":"2025-12-11T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.007249 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.007301 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.007319 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.007341 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.007358 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:58Z","lastTransitionTime":"2025-12-11T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.110396 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.110808 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.110954 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.111063 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.111148 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:58Z","lastTransitionTime":"2025-12-11T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.213281 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.213338 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.213350 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.213369 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.213380 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:58Z","lastTransitionTime":"2025-12-11T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.315835 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.315905 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.315924 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.315952 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.315970 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:58Z","lastTransitionTime":"2025-12-11T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.418113 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.418419 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.418604 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.418711 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.418817 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:58Z","lastTransitionTime":"2025-12-11T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.521384 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.521444 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.521477 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.521494 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.521505 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:58Z","lastTransitionTime":"2025-12-11T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.624388 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.624443 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.624497 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.624525 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.624542 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:58Z","lastTransitionTime":"2025-12-11T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.727450 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.727559 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.727613 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.727642 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.727663 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:58Z","lastTransitionTime":"2025-12-11T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.830145 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.830184 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.830193 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.830206 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.830235 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:58Z","lastTransitionTime":"2025-12-11T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.933511 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.933558 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.933570 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.933585 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:58 crc kubenswrapper[4898]: I1211 13:04:58.933594 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:58Z","lastTransitionTime":"2025-12-11T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.036334 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.036385 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.036396 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.036412 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.036424 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:59Z","lastTransitionTime":"2025-12-11T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.138717 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.138770 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.138781 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.138797 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.138805 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:59Z","lastTransitionTime":"2025-12-11T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.241579 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.241642 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.241660 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.241683 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.241702 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:59Z","lastTransitionTime":"2025-12-11T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.343747 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.343805 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.343816 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.343831 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.343843 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:59Z","lastTransitionTime":"2025-12-11T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.446527 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.446582 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.446597 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.446618 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.446632 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:59Z","lastTransitionTime":"2025-12-11T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.549164 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.549227 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.549245 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.549269 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.549287 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:59Z","lastTransitionTime":"2025-12-11T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.652593 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.652634 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.652644 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.652658 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.652668 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:59Z","lastTransitionTime":"2025-12-11T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.756138 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.756223 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.756249 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.756278 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.756301 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:59Z","lastTransitionTime":"2025-12-11T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.773907 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.773992 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.774032 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.773927 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 11 13:04:59 crc kubenswrapper[4898]: E1211 13:04:59.774080 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 11 13:04:59 crc kubenswrapper[4898]: E1211 13:04:59.774240 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978"
Dec 11 13:04:59 crc kubenswrapper[4898]: E1211 13:04:59.774363 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 11 13:04:59 crc kubenswrapper[4898]: E1211 13:04:59.774507 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.859334 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.859425 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.859503 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.859541 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.859571 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:59Z","lastTransitionTime":"2025-12-11T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.962223 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.962312 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.962336 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.962368 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:04:59 crc kubenswrapper[4898]: I1211 13:04:59.962388 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:04:59Z","lastTransitionTime":"2025-12-11T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.065405 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.065509 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.065523 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.065538 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.065548 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:00Z","lastTransitionTime":"2025-12-11T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.168574 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.168612 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.168621 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.168650 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.168663 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:00Z","lastTransitionTime":"2025-12-11T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.272154 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.272217 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.272233 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.272256 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.272273 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:00Z","lastTransitionTime":"2025-12-11T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.375857 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.375923 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.375942 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.375967 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.375984 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:00Z","lastTransitionTime":"2025-12-11T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.479150 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.479234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.479244 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.479258 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.479271 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:00Z","lastTransitionTime":"2025-12-11T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.581523 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.581596 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.581615 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.581640 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.581659 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:00Z","lastTransitionTime":"2025-12-11T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.683656 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.683724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.683741 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.683767 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.683783 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:00Z","lastTransitionTime":"2025-12-11T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.786262 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.786320 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.786339 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.786364 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.786387 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:00Z","lastTransitionTime":"2025-12-11T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.889016 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.889083 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.889104 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.889128 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.889146 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:00Z","lastTransitionTime":"2025-12-11T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.991871 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.991943 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.991957 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.991983 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:00 crc kubenswrapper[4898]: I1211 13:05:00.991996 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:00Z","lastTransitionTime":"2025-12-11T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.095524 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.095595 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.095618 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.095649 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.095671 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:01Z","lastTransitionTime":"2025-12-11T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.198183 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.198230 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.198240 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.198259 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.198281 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:01Z","lastTransitionTime":"2025-12-11T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.301208 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.301271 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.301288 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.301312 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.301331 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:01Z","lastTransitionTime":"2025-12-11T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.403735 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.403800 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.403815 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.403837 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.403853 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:01Z","lastTransitionTime":"2025-12-11T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.507729 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.507784 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.507800 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.507834 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.507853 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:01Z","lastTransitionTime":"2025-12-11T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.610015 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.610063 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.610075 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.610093 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.610110 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:01Z","lastTransitionTime":"2025-12-11T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.712360 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.712416 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.712433 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.712494 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.712518 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:01Z","lastTransitionTime":"2025-12-11T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.774212 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.774298 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:01 crc kubenswrapper[4898]: E1211 13:05:01.774339 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.774363 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.774367 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:01 crc kubenswrapper[4898]: E1211 13:05:01.774499 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:01 crc kubenswrapper[4898]: E1211 13:05:01.774665 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:01 crc kubenswrapper[4898]: E1211 13:05:01.774728 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.815705 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.815761 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.815780 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.815803 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.815820 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:01Z","lastTransitionTime":"2025-12-11T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.918709 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.918772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.918791 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.918815 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:01 crc kubenswrapper[4898]: I1211 13:05:01.918833 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:01Z","lastTransitionTime":"2025-12-11T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.022185 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.024165 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.024183 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.024213 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.024226 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:02Z","lastTransitionTime":"2025-12-11T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.127285 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.127334 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.127345 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.127364 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.127377 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:02Z","lastTransitionTime":"2025-12-11T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.229660 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.229891 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.229981 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.230072 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.230159 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:02Z","lastTransitionTime":"2025-12-11T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.333611 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.333671 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.333688 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.333711 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.333730 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:02Z","lastTransitionTime":"2025-12-11T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.435305 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.435358 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.435367 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.435380 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.435388 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:02Z","lastTransitionTime":"2025-12-11T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.538056 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.538121 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.538132 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.538149 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.538162 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:02Z","lastTransitionTime":"2025-12-11T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.650518 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.650547 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.650555 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.650568 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.650577 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:02Z","lastTransitionTime":"2025-12-11T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.752955 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.752992 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.753018 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.753034 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.753043 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:02Z","lastTransitionTime":"2025-12-11T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.798839 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:02Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.813618 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:02Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.834982 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:02Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.849697 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:02Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.856133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.856216 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.856243 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.856269 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.856288 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:02Z","lastTransitionTime":"2025-12-11T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.866097 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbb0d9e2-813d-44ce-a547-a3c6f60c9f6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84fc12f1339e236216301663a693fc0ae8c043f282081b77e08455869556f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda5c81dd6cdc256d7acd24d32f956484e8823bed9e29c5713a722d8ebd6f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfca7c42a03929bac318edc66ce1a6c7f88f795fd3b58db686b8484edfac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources
\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:02Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.902905 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:02Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.916169 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:02Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.929081 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116c
e6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:02Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.958693 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"message\\\":\\\"pin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1211 13:04:55.603698 6618 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nF1211 13:04:55.603702 6618 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:55.6\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d0
7281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:02Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.959219 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.959291 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.959312 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.959341 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.959362 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:02Z","lastTransitionTime":"2025-12-11T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.976056 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1cd2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:02Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:02 crc kubenswrapper[4898]: I1211 13:05:02.998174 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:02Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.024794 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:03Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.046558 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:03Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.062968 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.063003 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.063018 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.063039 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.063053 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:03Z","lastTransitionTime":"2025-12-11T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.071449 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:03Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.087686 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:03Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:03 crc 
kubenswrapper[4898]: I1211 13:05:03.102935 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:03Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.116150 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:03Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.133933 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:03Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.166114 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.166188 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.166213 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.166245 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.166268 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:03Z","lastTransitionTime":"2025-12-11T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.269041 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.269089 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.269108 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.269132 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.269151 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:03Z","lastTransitionTime":"2025-12-11T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.371723 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.371773 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.371791 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.371816 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.371834 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:03Z","lastTransitionTime":"2025-12-11T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.474530 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.474562 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.474572 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.474587 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.474598 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:03Z","lastTransitionTime":"2025-12-11T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.577895 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.577959 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.577978 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.578004 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.578022 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:03Z","lastTransitionTime":"2025-12-11T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.681126 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.681171 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.681182 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.681197 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.681209 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:03Z","lastTransitionTime":"2025-12-11T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.774829 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:03 crc kubenswrapper[4898]: E1211 13:05:03.774997 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.775088 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.775156 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:03 crc kubenswrapper[4898]: E1211 13:05:03.775265 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.774856 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:03 crc kubenswrapper[4898]: E1211 13:05:03.775397 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:03 crc kubenswrapper[4898]: E1211 13:05:03.775827 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.783751 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.783822 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.783836 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.783853 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.783870 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:03Z","lastTransitionTime":"2025-12-11T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.886517 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.886564 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.886575 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.886593 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.886604 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:03Z","lastTransitionTime":"2025-12-11T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.989509 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.989548 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.989557 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.989570 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:03 crc kubenswrapper[4898]: I1211 13:05:03.989583 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:03Z","lastTransitionTime":"2025-12-11T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.092869 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.092909 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.092917 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.092930 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.092947 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:04Z","lastTransitionTime":"2025-12-11T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.194968 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.195066 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.195078 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.195091 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.195105 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:04Z","lastTransitionTime":"2025-12-11T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.298722 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.298763 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.298771 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.298790 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.298803 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:04Z","lastTransitionTime":"2025-12-11T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.401568 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.401604 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.401615 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.401627 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.401636 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:04Z","lastTransitionTime":"2025-12-11T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.504178 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.504231 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.504241 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.504256 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.504267 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:04Z","lastTransitionTime":"2025-12-11T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.607619 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.607672 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.607684 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.607704 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.607716 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:04Z","lastTransitionTime":"2025-12-11T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.712020 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.712076 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.712086 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.712107 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.712120 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:04Z","lastTransitionTime":"2025-12-11T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.814838 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.814895 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.814922 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.814945 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.814963 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:04Z","lastTransitionTime":"2025-12-11T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.917166 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.917236 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.917255 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.917278 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:04 crc kubenswrapper[4898]: I1211 13:05:04.917295 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:04Z","lastTransitionTime":"2025-12-11T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.020281 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.020354 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.020377 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.020406 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.020430 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:05Z","lastTransitionTime":"2025-12-11T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.123676 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.123759 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.123772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.123795 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.123809 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:05Z","lastTransitionTime":"2025-12-11T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.226563 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.226679 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.226699 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.226724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.226742 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:05Z","lastTransitionTime":"2025-12-11T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.331247 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.331306 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.331323 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.331349 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.331366 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:05Z","lastTransitionTime":"2025-12-11T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.435147 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.435238 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.435276 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.435307 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.435327 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:05Z","lastTransitionTime":"2025-12-11T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.538002 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.538029 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.538039 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.538050 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.538060 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:05Z","lastTransitionTime":"2025-12-11T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.641563 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.641610 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.641621 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.641638 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.641650 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:05Z","lastTransitionTime":"2025-12-11T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.744298 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.744337 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.744354 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.744370 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.744380 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:05Z","lastTransitionTime":"2025-12-11T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.774414 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.774447 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.774521 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.774483 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:05 crc kubenswrapper[4898]: E1211 13:05:05.774658 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:05 crc kubenswrapper[4898]: E1211 13:05:05.774758 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:05 crc kubenswrapper[4898]: E1211 13:05:05.774855 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:05 crc kubenswrapper[4898]: E1211 13:05:05.774965 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.847106 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.847218 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.847240 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.847270 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.847293 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:05Z","lastTransitionTime":"2025-12-11T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.950272 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.950341 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.950369 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.950397 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:05 crc kubenswrapper[4898]: I1211 13:05:05.950418 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:05Z","lastTransitionTime":"2025-12-11T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.053503 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.053566 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.053583 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.053605 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.053621 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:06Z","lastTransitionTime":"2025-12-11T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.156331 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.156379 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.156395 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.156416 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.156432 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:06Z","lastTransitionTime":"2025-12-11T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.258682 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.258736 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.258753 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.258778 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.258794 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:06Z","lastTransitionTime":"2025-12-11T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.361854 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.361922 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.361950 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.361976 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.361995 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:06Z","lastTransitionTime":"2025-12-11T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.464891 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.464943 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.464960 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.464982 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.464999 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:06Z","lastTransitionTime":"2025-12-11T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.567871 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.567946 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.567975 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.568006 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.568025 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:06Z","lastTransitionTime":"2025-12-11T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.671487 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.671543 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.671557 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.671574 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.671586 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:06Z","lastTransitionTime":"2025-12-11T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.779237 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.779320 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.779742 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.779789 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.779809 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:06Z","lastTransitionTime":"2025-12-11T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.819953 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.820000 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.820012 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.820028 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.820041 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:06Z","lastTransitionTime":"2025-12-11T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:06 crc kubenswrapper[4898]: E1211 13:05:06.833392 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:06Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.837198 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.837271 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.837284 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.837303 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.837316 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:06Z","lastTransitionTime":"2025-12-11T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:06 crc kubenswrapper[4898]: E1211 13:05:06.849442 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:06Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.852661 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.852699 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.852712 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.852729 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.852740 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:06Z","lastTransitionTime":"2025-12-11T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:06 crc kubenswrapper[4898]: E1211 13:05:06.867214 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:06Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.870784 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.870822 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.870833 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.870849 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.870860 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:06Z","lastTransitionTime":"2025-12-11T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:06 crc kubenswrapper[4898]: E1211 13:05:06.899611 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:06Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:06 crc kubenswrapper[4898]: E1211 13:05:06.899760 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.901751 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.901776 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.901787 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.901805 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:06 crc kubenswrapper[4898]: I1211 13:05:06.901817 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:06Z","lastTransitionTime":"2025-12-11T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.004630 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.004678 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.004690 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.004707 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.004718 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:07Z","lastTransitionTime":"2025-12-11T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.106701 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.106731 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.106738 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.106753 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.106763 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:07Z","lastTransitionTime":"2025-12-11T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.209041 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.209082 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.209090 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.209104 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.209113 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:07Z","lastTransitionTime":"2025-12-11T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.311610 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.311647 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.311656 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.311669 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.311679 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:07Z","lastTransitionTime":"2025-12-11T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.414106 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.414150 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.414160 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.414174 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.414184 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:07Z","lastTransitionTime":"2025-12-11T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.516559 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.516601 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.516614 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.516631 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.516643 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:07Z","lastTransitionTime":"2025-12-11T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.619190 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.619228 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.619238 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.619254 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.619264 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:07Z","lastTransitionTime":"2025-12-11T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.722298 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.722346 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.722358 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.722377 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.722387 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:07Z","lastTransitionTime":"2025-12-11T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.774242 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.774299 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:07 crc kubenswrapper[4898]: E1211 13:05:07.774370 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.774307 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:07 crc kubenswrapper[4898]: E1211 13:05:07.774529 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.774546 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:07 crc kubenswrapper[4898]: E1211 13:05:07.774646 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:07 crc kubenswrapper[4898]: E1211 13:05:07.774778 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.825509 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.825569 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.825584 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.825602 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.825613 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:07Z","lastTransitionTime":"2025-12-11T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.928103 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.928151 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.928185 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.928202 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:07 crc kubenswrapper[4898]: I1211 13:05:07.928214 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:07Z","lastTransitionTime":"2025-12-11T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.030964 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.031003 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.031012 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.031026 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.031036 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:08Z","lastTransitionTime":"2025-12-11T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.133095 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.133155 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.133169 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.133187 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.133200 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:08Z","lastTransitionTime":"2025-12-11T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.236084 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.236128 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.236137 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.236151 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.236161 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:08Z","lastTransitionTime":"2025-12-11T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.286732 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs\") pod \"network-metrics-daemon-zcq7l\" (UID: \"34380c7c-1d75-4f6f-a6cb-b015a55ca978\") " pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:08 crc kubenswrapper[4898]: E1211 13:05:08.286935 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 13:05:08 crc kubenswrapper[4898]: E1211 13:05:08.287028 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs podName:34380c7c-1d75-4f6f-a6cb-b015a55ca978 nodeName:}" failed. No retries permitted until 2025-12-11 13:05:40.287010535 +0000 UTC m=+97.859336972 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs") pod "network-metrics-daemon-zcq7l" (UID: "34380c7c-1d75-4f6f-a6cb-b015a55ca978") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.338944 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.338994 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.339008 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.339025 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.339038 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:08Z","lastTransitionTime":"2025-12-11T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.441414 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.441483 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.441497 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.441512 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.441524 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:08Z","lastTransitionTime":"2025-12-11T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.544993 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.545032 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.545041 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.545056 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.545065 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:08Z","lastTransitionTime":"2025-12-11T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.648391 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.648485 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.648504 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.648529 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.648549 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:08Z","lastTransitionTime":"2025-12-11T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.751517 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.751572 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.751583 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.751600 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.751611 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:08Z","lastTransitionTime":"2025-12-11T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.854561 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.854616 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.854652 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.854681 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.854703 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:08Z","lastTransitionTime":"2025-12-11T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.957363 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.957398 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.957414 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.957428 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:08 crc kubenswrapper[4898]: I1211 13:05:08.957439 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:08Z","lastTransitionTime":"2025-12-11T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.060022 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.060064 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.060075 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.060096 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.060111 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:09Z","lastTransitionTime":"2025-12-11T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.163485 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.163850 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.163863 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.163882 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.163894 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:09Z","lastTransitionTime":"2025-12-11T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.266362 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.266401 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.266413 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.266430 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.266442 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:09Z","lastTransitionTime":"2025-12-11T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.368135 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.368217 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.368229 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.368246 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.368260 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:09Z","lastTransitionTime":"2025-12-11T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.471305 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.471339 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.471349 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.471364 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.471374 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:09Z","lastTransitionTime":"2025-12-11T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.573873 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.573919 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.573931 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.573951 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.573964 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:09Z","lastTransitionTime":"2025-12-11T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.676684 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.676726 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.676738 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.676753 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.676764 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:09Z","lastTransitionTime":"2025-12-11T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.774121 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.774323 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.774358 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.774386 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:09 crc kubenswrapper[4898]: E1211 13:05:09.774496 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:09 crc kubenswrapper[4898]: E1211 13:05:09.774583 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:09 crc kubenswrapper[4898]: E1211 13:05:09.774674 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.774692 4898 scope.go:117] "RemoveContainer" containerID="18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9" Dec 11 13:05:09 crc kubenswrapper[4898]: E1211 13:05:09.774736 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:09 crc kubenswrapper[4898]: E1211 13:05:09.774920 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.779096 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.779128 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.779137 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.779152 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.779163 4898 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:09Z","lastTransitionTime":"2025-12-11T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.881421 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.881499 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.881515 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.881536 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.881552 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:09Z","lastTransitionTime":"2025-12-11T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.984174 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.984218 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.984232 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.984249 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:09 crc kubenswrapper[4898]: I1211 13:05:09.984260 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:09Z","lastTransitionTime":"2025-12-11T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.086724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.086763 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.086773 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.086789 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.086800 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:10Z","lastTransitionTime":"2025-12-11T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.189295 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.189374 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.189399 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.189431 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.189510 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:10Z","lastTransitionTime":"2025-12-11T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.200464 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlqfj_4e8ed6cb-b822-4b64-9e00-e755c5aea812/kube-multus/0.log" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.200577 4898 generic.go:334] "Generic (PLEG): container finished" podID="4e8ed6cb-b822-4b64-9e00-e755c5aea812" containerID="76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995" exitCode=1 Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.200655 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dlqfj" event={"ID":"4e8ed6cb-b822-4b64-9e00-e755c5aea812","Type":"ContainerDied","Data":"76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995"} Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.201179 4898 scope.go:117] "RemoveContainer" containerID="76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.215026 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.231548 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.242773 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.256390 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:05:09Z\\\",\\\"message\\\":\\\"2025-12-11T13:04:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6\\\\n2025-12-11T13:04:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6 to /host/opt/cni/bin/\\\\n2025-12-11T13:04:24Z [verbose] multus-daemon started\\\\n2025-12-11T13:04:24Z [verbose] Readiness Indicator file check\\\\n2025-12-11T13:05:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.265170 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.276629 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116c
e6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.291783 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.291934 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.292000 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:10 crc 
kubenswrapper[4898]: I1211 13:05:10.292078 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.292173 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:10Z","lastTransitionTime":"2025-12-11T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.299884 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"message\\\":\\\"pin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1211 13:04:55.603698 6618 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nF1211 13:04:55.603702 6618 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:55.6\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d0
7281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.310859 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1c
d2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.330113 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbb0d9e2-813d-44ce-a547-a3c6f60c9f6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84fc12f1339e236216301663a693fc0ae8c043f282081b77e08455869556f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda5c81dd6cdc256d7acd24d32f956484e8823bed9e29c5713a722d8ebd6f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfca7c42a03929bac318edc66ce1a6c7f88f795fd3b58db686b8484edfac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.351766 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.365237 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.375239 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc 
kubenswrapper[4898]: I1211 13:05:10.389911 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23
770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 
13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.394081 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.394121 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.394135 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.394151 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.394163 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:10Z","lastTransitionTime":"2025-12-11T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.400424 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.412907 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.423968 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.435914 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.447644 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:10Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.497026 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.497073 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.497084 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.497100 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.497113 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:10Z","lastTransitionTime":"2025-12-11T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.599501 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.599549 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.599558 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.599573 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.599583 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:10Z","lastTransitionTime":"2025-12-11T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.702109 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.702157 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.702167 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.702182 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.702194 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:10Z","lastTransitionTime":"2025-12-11T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.804745 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.804812 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.804831 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.804854 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.804873 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:10Z","lastTransitionTime":"2025-12-11T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.907498 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.907561 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.907588 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.907616 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:10 crc kubenswrapper[4898]: I1211 13:05:10.907640 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:10Z","lastTransitionTime":"2025-12-11T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.009853 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.009892 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.009901 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.009914 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.009926 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:11Z","lastTransitionTime":"2025-12-11T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.111737 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.111776 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.111787 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.111801 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.111810 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:11Z","lastTransitionTime":"2025-12-11T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.205001 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlqfj_4e8ed6cb-b822-4b64-9e00-e755c5aea812/kube-multus/0.log" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.205280 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dlqfj" event={"ID":"4e8ed6cb-b822-4b64-9e00-e755c5aea812","Type":"ContainerStarted","Data":"dbc9b844a873836a500af9d781e17c635495762b0e50af71dadf141ff00723a3"} Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.215675 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.215829 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.215927 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.216015 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.216111 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:11Z","lastTransitionTime":"2025-12-11T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.217759 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.234539 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.253680 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"message\\\":\\\"pin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1211 13:04:55.603698 6618 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nF1211 13:04:55.603702 6618 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:55.6\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d0
7281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.265783 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1c
d2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.276490 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbb0d9e2-813d-44ce-a547-a3c6f60c9f6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84fc12f1339e236216301663a693fc0ae8c043f282081b77e08455869556f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda5c81dd6cdc256d7acd24d32f956484e8823bed9e29c5713a722d8ebd6f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfca7c42a03929bac318edc66ce1a6c7f88f795fd3b58db686b8484edfac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.296584 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.311004 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.319090 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.319118 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.319129 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.319143 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.319155 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:11Z","lastTransitionTime":"2025-12-11T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.322535 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc 
kubenswrapper[4898]: I1211 13:05:11.337721 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23
770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 
13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.357253 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.370736 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.384646 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.400653 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.417786 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.421726 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.421755 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.421763 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.421775 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.421783 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:11Z","lastTransitionTime":"2025-12-11T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.431619 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.446490 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.460795 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.484014 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbc9b844a873836a500af9d781e17c635495762b0e50af71dadf141ff00723a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:05:09Z\\\",\\\"message\\\":\\\"2025-12-11T13:04:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6\\\\n2025-12-11T13:04:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6 to /host/opt/cni/bin/\\\\n2025-12-11T13:04:24Z [verbose] multus-daemon started\\\\n2025-12-11T13:04:24Z [verbose] Readiness Indicator file check\\\\n2025-12-11T13:05:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"m
ountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:11Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.524306 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.524344 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.524357 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.524374 4898 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.524386 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:11Z","lastTransitionTime":"2025-12-11T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.626550 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.626585 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.626594 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.626606 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.626614 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:11Z","lastTransitionTime":"2025-12-11T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.728628 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.729000 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.729089 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.729156 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.729220 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:11Z","lastTransitionTime":"2025-12-11T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.774977 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:11 crc kubenswrapper[4898]: E1211 13:05:11.775338 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.774999 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:11 crc kubenswrapper[4898]: E1211 13:05:11.775773 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.774977 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.775004 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:11 crc kubenswrapper[4898]: E1211 13:05:11.775873 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:11 crc kubenswrapper[4898]: E1211 13:05:11.775930 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.831120 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.831200 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.831227 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.831257 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.831283 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:11Z","lastTransitionTime":"2025-12-11T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.934141 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.934432 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.934563 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.934686 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:11 crc kubenswrapper[4898]: I1211 13:05:11.934786 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:11Z","lastTransitionTime":"2025-12-11T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.038051 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.038101 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.038114 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.038133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.038146 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:12Z","lastTransitionTime":"2025-12-11T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.139761 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.140539 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.140786 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.140968 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.141143 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:12Z","lastTransitionTime":"2025-12-11T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.243525 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.243559 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.243570 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.243584 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.243595 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:12Z","lastTransitionTime":"2025-12-11T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.346447 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.346503 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.346514 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.346531 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.346542 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:12Z","lastTransitionTime":"2025-12-11T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.448887 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.448926 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.448934 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.448947 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.448957 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:12Z","lastTransitionTime":"2025-12-11T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.551115 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.551174 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.551191 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.551215 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.551234 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:12Z","lastTransitionTime":"2025-12-11T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.653951 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.653986 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.653999 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.654014 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.654022 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:12Z","lastTransitionTime":"2025-12-11T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.756257 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.756293 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.756305 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.756319 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.756329 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:12Z","lastTransitionTime":"2025-12-11T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.790421 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:12Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.807723 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"message\\\":\\\"pin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1211 13:04:55.603698 6618 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nF1211 13:04:55.603702 6618 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:55.6\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d0
7281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:12Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.821987 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1c
d2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:12Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.836293 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbb0d9e2-813d-44ce-a547-a3c6f60c9f6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84fc12f1339e236216301663a693fc0ae8c043f282081b77e08455869556f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda5c81dd6cdc256d7acd24d32f956484e8823bed9e29c5713a722d8ebd6f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfca7c42a03929bac318edc66ce1a6c7f88f795fd3b58db686b8484edfac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:12Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.858234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.858279 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.858293 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.858311 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.858323 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:12Z","lastTransitionTime":"2025-12-11T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.860875 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:12Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.876095 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:12Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.891822 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:12Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:12 crc 
kubenswrapper[4898]: I1211 13:05:12.906625 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23
770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 
13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:12Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.918449 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:12Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.930073 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:12Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.946013 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:12Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.958679 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:12Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.960105 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.960130 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.960137 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.960151 
4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.960159 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:12Z","lastTransitionTime":"2025-12-11T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.974251 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:12Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:12 crc kubenswrapper[4898]: I1211 13:05:12.987950 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:12Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.002262 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:13Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.017114 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:13Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.030117 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbc9b844a873836a500af9d781e17c635495762b0e50af71dadf141ff00723a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:05:09Z\\\",\\\"message\\\":\\\"2025-12-11T13:04:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6\\\\n2025-12-11T13:04:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6 to /host/opt/cni/bin/\\\\n2025-12-11T13:04:24Z [verbose] multus-daemon started\\\\n2025-12-11T13:04:24Z [verbose] Readiness Indicator file check\\\\n2025-12-11T13:05:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"m
ountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:13Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.039704 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:13Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.062841 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.062875 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.062884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.062896 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.062907 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:13Z","lastTransitionTime":"2025-12-11T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.165502 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.165539 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.165549 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.165570 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.165579 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:13Z","lastTransitionTime":"2025-12-11T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.268536 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.268568 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.268577 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.268589 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.268600 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:13Z","lastTransitionTime":"2025-12-11T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.371252 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.371289 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.371300 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.371317 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.371331 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:13Z","lastTransitionTime":"2025-12-11T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.473404 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.473451 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.473479 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.473497 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.473507 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:13Z","lastTransitionTime":"2025-12-11T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.576779 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.576838 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.576849 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.576865 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.576876 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:13Z","lastTransitionTime":"2025-12-11T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.679371 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.679449 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.679479 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.679495 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.679505 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:13Z","lastTransitionTime":"2025-12-11T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.774639 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.774691 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.774742 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:13 crc kubenswrapper[4898]: E1211 13:05:13.774791 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.774809 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:13 crc kubenswrapper[4898]: E1211 13:05:13.774895 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:13 crc kubenswrapper[4898]: E1211 13:05:13.774946 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:13 crc kubenswrapper[4898]: E1211 13:05:13.775040 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.781225 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.781266 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.781279 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.781302 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.781317 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:13Z","lastTransitionTime":"2025-12-11T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.884561 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.884609 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.884622 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.884639 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.884654 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:13Z","lastTransitionTime":"2025-12-11T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.987299 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.987366 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.987392 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.987423 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:13 crc kubenswrapper[4898]: I1211 13:05:13.987447 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:13Z","lastTransitionTime":"2025-12-11T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.089543 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.089586 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.089600 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.089617 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.089629 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:14Z","lastTransitionTime":"2025-12-11T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.191781 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.191861 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.191881 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.191912 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.191930 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:14Z","lastTransitionTime":"2025-12-11T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.294620 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.294648 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.294656 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.294669 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.294676 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:14Z","lastTransitionTime":"2025-12-11T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.397289 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.397647 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.397662 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.397680 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.397695 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:14Z","lastTransitionTime":"2025-12-11T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.499416 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.499723 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.499836 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.499969 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.500095 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:14Z","lastTransitionTime":"2025-12-11T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.603510 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.603571 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.603587 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.603608 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.603625 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:14Z","lastTransitionTime":"2025-12-11T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.706545 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.707058 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.707245 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.707425 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.707619 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:14Z","lastTransitionTime":"2025-12-11T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.810071 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.810098 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.810106 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.810117 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.810127 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:14Z","lastTransitionTime":"2025-12-11T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.912077 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.912409 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.912609 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.912759 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:14 crc kubenswrapper[4898]: I1211 13:05:14.912893 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:14Z","lastTransitionTime":"2025-12-11T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.016047 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.016098 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.016118 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.016146 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.016167 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:15Z","lastTransitionTime":"2025-12-11T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.118668 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.118963 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.119113 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.119271 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.119397 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:15Z","lastTransitionTime":"2025-12-11T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.234854 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.234941 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.234964 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.234998 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.235019 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:15Z","lastTransitionTime":"2025-12-11T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.336978 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.337048 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.337061 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.337080 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.337092 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:15Z","lastTransitionTime":"2025-12-11T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.442996 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.443051 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.443071 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.443106 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.443124 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:15Z","lastTransitionTime":"2025-12-11T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.547751 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.547788 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.547797 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.547812 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.547822 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:15Z","lastTransitionTime":"2025-12-11T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.650712 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.650777 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.650794 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.650818 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.650835 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:15Z","lastTransitionTime":"2025-12-11T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.754076 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.754128 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.754139 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.754154 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.754167 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:15Z","lastTransitionTime":"2025-12-11T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.774803 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.774894 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.774918 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:15 crc kubenswrapper[4898]: E1211 13:05:15.774926 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.775008 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:15 crc kubenswrapper[4898]: E1211 13:05:15.775229 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:15 crc kubenswrapper[4898]: E1211 13:05:15.775427 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:15 crc kubenswrapper[4898]: E1211 13:05:15.775539 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.857254 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.857278 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.857286 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.857299 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.857309 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:15Z","lastTransitionTime":"2025-12-11T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.960086 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.960157 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.960174 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.960200 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:15 crc kubenswrapper[4898]: I1211 13:05:15.960219 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:15Z","lastTransitionTime":"2025-12-11T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.063153 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.063201 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.063215 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.063234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.063248 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:16Z","lastTransitionTime":"2025-12-11T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.167198 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.167241 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.167251 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.167266 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.167277 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:16Z","lastTransitionTime":"2025-12-11T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.268620 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.268695 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.268718 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.268741 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.268758 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:16Z","lastTransitionTime":"2025-12-11T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.371601 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.371669 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.371686 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.371708 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.371724 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:16Z","lastTransitionTime":"2025-12-11T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.473931 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.473976 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.473988 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.474004 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.474017 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:16Z","lastTransitionTime":"2025-12-11T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.575771 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.575810 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.575818 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.575831 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.575840 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:16Z","lastTransitionTime":"2025-12-11T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.678232 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.678272 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.678283 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.678299 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.678312 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:16Z","lastTransitionTime":"2025-12-11T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.780808 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.780884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.780907 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.780938 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.780960 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:16Z","lastTransitionTime":"2025-12-11T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.883588 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.883631 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.883643 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.883657 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.883668 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:16Z","lastTransitionTime":"2025-12-11T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.985664 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.985755 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.985806 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.985829 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:16 crc kubenswrapper[4898]: I1211 13:05:16.985845 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:16Z","lastTransitionTime":"2025-12-11T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.088728 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.088779 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.088804 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.088827 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.088843 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:17Z","lastTransitionTime":"2025-12-11T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.096753 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.096794 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.096809 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.096827 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.096841 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:17Z","lastTransitionTime":"2025-12-11T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:17 crc kubenswrapper[4898]: E1211 13:05:17.115011 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:17Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.118695 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.118726 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.118737 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.118752 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.118763 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:17Z","lastTransitionTime":"2025-12-11T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:17 crc kubenswrapper[4898]: E1211 13:05:17.138164 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:17Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.165659 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.165724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.165744 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.165769 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.165788 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:17Z","lastTransitionTime":"2025-12-11T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:17 crc kubenswrapper[4898]: E1211 13:05:17.179376 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:17Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.184651 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.184689 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.184703 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.184724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.184740 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:17Z","lastTransitionTime":"2025-12-11T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:17 crc kubenswrapper[4898]: E1211 13:05:17.197544 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:17Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:17 crc kubenswrapper[4898]: E1211 13:05:17.197694 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.199880 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.199973 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.200003 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.200033 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.200062 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:17Z","lastTransitionTime":"2025-12-11T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.302392 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.302492 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.302520 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.302550 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.302572 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:17Z","lastTransitionTime":"2025-12-11T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.405770 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.405818 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.405833 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.405850 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.405861 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:17Z","lastTransitionTime":"2025-12-11T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.509133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.509188 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.509207 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.509230 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.509248 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:17Z","lastTransitionTime":"2025-12-11T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.611959 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.611993 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.612001 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.612013 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.612021 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:17Z","lastTransitionTime":"2025-12-11T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.714561 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.714625 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.714650 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.714678 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.714703 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:17Z","lastTransitionTime":"2025-12-11T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.774819 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.774957 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:17 crc kubenswrapper[4898]: E1211 13:05:17.775123 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.775180 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.775195 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:17 crc kubenswrapper[4898]: E1211 13:05:17.775556 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:17 crc kubenswrapper[4898]: E1211 13:05:17.775626 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:17 crc kubenswrapper[4898]: E1211 13:05:17.775390 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.817556 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.817624 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.817695 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.817727 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.817748 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:17Z","lastTransitionTime":"2025-12-11T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.920870 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.920922 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.920939 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.920965 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:17 crc kubenswrapper[4898]: I1211 13:05:17.920982 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:17Z","lastTransitionTime":"2025-12-11T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.023768 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.023817 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.023836 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.023858 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.023876 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:18Z","lastTransitionTime":"2025-12-11T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.127868 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.127916 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.127933 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.127960 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.127976 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:18Z","lastTransitionTime":"2025-12-11T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.230424 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.230559 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.230585 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.230615 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.230639 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:18Z","lastTransitionTime":"2025-12-11T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.333890 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.333968 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.333979 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.334032 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.334044 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:18Z","lastTransitionTime":"2025-12-11T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.436619 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.436695 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.436713 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.436737 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.436756 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:18Z","lastTransitionTime":"2025-12-11T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.543315 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.543351 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.543361 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.543377 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.543390 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:18Z","lastTransitionTime":"2025-12-11T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.646067 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.646162 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.646183 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.646237 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.646258 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:18Z","lastTransitionTime":"2025-12-11T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.748781 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.748835 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.748852 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.748876 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.748892 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:18Z","lastTransitionTime":"2025-12-11T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.852788 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.853232 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.853536 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.853831 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.854323 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:18Z","lastTransitionTime":"2025-12-11T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.956583 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.956620 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.956633 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.956652 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:18 crc kubenswrapper[4898]: I1211 13:05:18.956666 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:18Z","lastTransitionTime":"2025-12-11T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.059796 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.059833 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.059845 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.059860 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.059871 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:19Z","lastTransitionTime":"2025-12-11T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.162551 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.162873 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.162992 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.163106 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.163193 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:19Z","lastTransitionTime":"2025-12-11T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.265211 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.265601 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.265778 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.265920 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.266060 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:19Z","lastTransitionTime":"2025-12-11T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.369642 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.369714 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.369737 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.369768 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.369789 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:19Z","lastTransitionTime":"2025-12-11T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.472865 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.472916 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.472933 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.472960 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.472995 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:19Z","lastTransitionTime":"2025-12-11T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.576225 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.576296 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.576319 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.576347 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.576364 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:19Z","lastTransitionTime":"2025-12-11T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.679014 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.679072 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.679095 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.679121 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.679142 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:19Z","lastTransitionTime":"2025-12-11T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.774132 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:19 crc kubenswrapper[4898]: E1211 13:05:19.774307 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.774782 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.774839 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.774853 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:19 crc kubenswrapper[4898]: E1211 13:05:19.775921 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:19 crc kubenswrapper[4898]: E1211 13:05:19.776073 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:19 crc kubenswrapper[4898]: E1211 13:05:19.776242 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.781566 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.781599 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.781614 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.781634 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.781651 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:19Z","lastTransitionTime":"2025-12-11T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.884950 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.885012 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.885029 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.885101 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.885119 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:19Z","lastTransitionTime":"2025-12-11T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.987951 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.988021 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.988040 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.988066 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:19 crc kubenswrapper[4898]: I1211 13:05:19.988085 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:19Z","lastTransitionTime":"2025-12-11T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.090999 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.091080 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.091100 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.091129 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.091152 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:20Z","lastTransitionTime":"2025-12-11T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.193732 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.193788 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.193806 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.193831 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.193847 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:20Z","lastTransitionTime":"2025-12-11T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.297405 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.297487 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.297508 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.297537 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.297560 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:20Z","lastTransitionTime":"2025-12-11T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.400560 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.400620 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.400643 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.400672 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.400696 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:20Z","lastTransitionTime":"2025-12-11T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.504173 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.504662 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.504871 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.505063 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.505218 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:20Z","lastTransitionTime":"2025-12-11T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.608386 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.608448 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.608497 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.608524 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.608542 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:20Z","lastTransitionTime":"2025-12-11T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.711963 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.711998 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.712006 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.712019 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.712027 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:20Z","lastTransitionTime":"2025-12-11T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.776037 4898 scope.go:117] "RemoveContainer" containerID="18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.814142 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.814200 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.814209 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.814222 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.814232 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:20Z","lastTransitionTime":"2025-12-11T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.917221 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.917253 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.917261 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.917274 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:20 crc kubenswrapper[4898]: I1211 13:05:20.917283 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:20Z","lastTransitionTime":"2025-12-11T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.019881 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.019934 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.019945 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.019961 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.019976 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:21Z","lastTransitionTime":"2025-12-11T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.123177 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.123241 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.123258 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.123279 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.123296 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:21Z","lastTransitionTime":"2025-12-11T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.225957 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.226026 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.226047 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.226075 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.226097 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:21Z","lastTransitionTime":"2025-12-11T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.328642 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.328704 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.328724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.328747 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.328764 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:21Z","lastTransitionTime":"2025-12-11T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.432434 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.432529 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.432548 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.432573 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.432590 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:21Z","lastTransitionTime":"2025-12-11T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.534597 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.534663 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.534681 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.534705 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.534722 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:21Z","lastTransitionTime":"2025-12-11T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.637077 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.637133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.637149 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.637171 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.637186 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:21Z","lastTransitionTime":"2025-12-11T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.740675 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.740735 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.740747 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.740766 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.740781 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:21Z","lastTransitionTime":"2025-12-11T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.774193 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.774314 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:21 crc kubenswrapper[4898]: E1211 13:05:21.774363 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.774379 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.774434 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:21 crc kubenswrapper[4898]: E1211 13:05:21.774638 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:21 crc kubenswrapper[4898]: E1211 13:05:21.774739 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:21 crc kubenswrapper[4898]: E1211 13:05:21.774829 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.866230 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.866284 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.866301 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.866322 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.866339 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:21Z","lastTransitionTime":"2025-12-11T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.968628 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.969024 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.969238 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.969526 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:21 crc kubenswrapper[4898]: I1211 13:05:21.969763 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:21Z","lastTransitionTime":"2025-12-11T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.072496 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.072530 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.072544 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.072563 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.072575 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:22Z","lastTransitionTime":"2025-12-11T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.175574 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.175619 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.175631 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.175648 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.175760 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:22Z","lastTransitionTime":"2025-12-11T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.278359 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.278417 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.278433 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.278480 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.278501 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:22Z","lastTransitionTime":"2025-12-11T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.382111 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.382400 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.382534 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.382635 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.382724 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:22Z","lastTransitionTime":"2025-12-11T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.485954 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.486000 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.486012 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.486030 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.486042 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:22Z","lastTransitionTime":"2025-12-11T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.592218 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.592255 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.592271 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.592290 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.592305 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:22Z","lastTransitionTime":"2025-12-11T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.695363 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.695400 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.695408 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.695422 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.695432 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:22Z","lastTransitionTime":"2025-12-11T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.792113 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.798250 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.798307 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.798318 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.798339 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.798381 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:22Z","lastTransitionTime":"2025-12-11T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.809966 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.843859 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"message\\\":\\\"pin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1211 13:04:55.603698 6618 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nF1211 13:04:55.603702 6618 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:55.6\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d0
7281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.861259 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1c
d2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.889350 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbb0d9e2-813d-44ce-a547-a3c6f60c9f6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84fc12f1339e236216301663a693fc0ae8c043f282081b77e08455869556f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda5c81dd6cdc256d7acd24d32f956484e8823bed9e29c5713a722d8ebd6f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfca7c42a03929bac318edc66ce1a6c7f88f795fd3b58db686b8484edfac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.900761 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.900811 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.900824 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.900843 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.900857 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:22Z","lastTransitionTime":"2025-12-11T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.934400 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.952120 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.963164 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:22 crc 
kubenswrapper[4898]: I1211 13:05:22.981591 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23
770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 
13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:22 crc kubenswrapper[4898]: I1211 13:05:22.995747 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:22Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.003269 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.003344 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.003359 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.003384 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.003404 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:23Z","lastTransitionTime":"2025-12-11T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.009623 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.021660 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.033525 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.047845 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.061622 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.075540 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.086240 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.100696 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbc9b844a873836a500af9d781e17c635495762b0e50af71dadf141ff00723a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:05:09Z\\\",\\\"message\\\":\\\"2025-12-11T13:04:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6\\\\n2025-12-11T13:04:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6 to /host/opt/cni/bin/\\\\n2025-12-11T13:04:24Z [verbose] multus-daemon started\\\\n2025-12-11T13:04:24Z [verbose] Readiness Indicator file check\\\\n2025-12-11T13:05:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.105313 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.105550 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.105651 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.105779 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.105863 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:23Z","lastTransitionTime":"2025-12-11T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.207659 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.207707 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.207719 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.207735 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.207747 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:23Z","lastTransitionTime":"2025-12-11T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.268167 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovnkube-controller/2.log" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.280514 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerStarted","Data":"b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde"} Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.281498 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.295198 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.306907 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.310765 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.310806 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.310821 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.310839 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.310854 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:23Z","lastTransitionTime":"2025-12-11T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.327494 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbc9b844a873836a500af9d781e17c635495762b0e50af71dadf141ff00723a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:05:09Z\\\",\\\"message\\\":\\\"2025-12-11T13:04:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6\\\\n2025-12-11T13:04:23+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6 to /host/opt/cni/bin/\\\\n2025-12-11T13:04:24Z [verbose] multus-daemon started\\\\n2025-12-11T13:04:24Z [verbose] Readiness Indicator file check\\\\n2025-12-11T13:05:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.342576 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.355064 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbb0d9e2-813d-44ce-a547-a3c6f60c9f6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84fc12f1339e236216301663a693fc0ae8c043f282081b77e08455869556f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda5c81dd6cdc256d7acd24d32f956484e8823bed9e29c5713a722d8ebd6f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfca7c42a03929bac318edc66ce1a6c7f88f795fd3b58db686b8484edfac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.376260 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.387764 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.398533 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116c
e6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.413005 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.413060 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.413078 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:23 crc 
kubenswrapper[4898]: I1211 13:05:23.413100 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.413112 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:23Z","lastTransitionTime":"2025-12-11T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.415387 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"message\\\":\\\"pin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1211 13:04:55.603698 6618 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nF1211 13:04:55.603702 6618 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z]\\\\nI1211 
13:04:55.6\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\
"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.426501 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1c
d2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.441583 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\"
,\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d2
42327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.453536 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.465685 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.477845 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.487911 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc 
kubenswrapper[4898]: I1211 13:05:23.500973 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.512694 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.515078 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 
13:05:23.515225 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.515287 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.515356 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.515415 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:23Z","lastTransitionTime":"2025-12-11T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.532136 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.617688 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.617715 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.617725 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 
13:05:23.617737 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.617745 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:23Z","lastTransitionTime":"2025-12-11T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.720588 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.721028 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.721175 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.721317 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.721444 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:23Z","lastTransitionTime":"2025-12-11T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.774116 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.774174 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.774261 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:23 crc kubenswrapper[4898]: E1211 13:05:23.774366 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:23 crc kubenswrapper[4898]: E1211 13:05:23.774434 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:23 crc kubenswrapper[4898]: E1211 13:05:23.774537 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.774785 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:23 crc kubenswrapper[4898]: E1211 13:05:23.775027 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.824440 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.824715 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.824809 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.824942 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.825030 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:23Z","lastTransitionTime":"2025-12-11T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.927871 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.927940 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.927958 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.927983 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:23 crc kubenswrapper[4898]: I1211 13:05:23.928002 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:23Z","lastTransitionTime":"2025-12-11T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.030898 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.030936 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.030945 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.030957 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.030966 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:24Z","lastTransitionTime":"2025-12-11T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.133856 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.134257 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.134440 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.134691 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.134852 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:24Z","lastTransitionTime":"2025-12-11T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.237855 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.238259 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.238492 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.238804 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.239050 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:24Z","lastTransitionTime":"2025-12-11T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.286646 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovnkube-controller/3.log" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.288446 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovnkube-controller/2.log" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.297446 4898 generic.go:334] "Generic (PLEG): container finished" podID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerID="b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde" exitCode=1 Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.297530 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerDied","Data":"b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde"} Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.297588 4898 scope.go:117] "RemoveContainer" containerID="18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.307099 4898 scope.go:117] "RemoveContainer" containerID="b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde" Dec 11 13:05:24 crc kubenswrapper[4898]: E1211 13:05:24.307370 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.318800 4898 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe
7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.333335 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.341505 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.341534 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.341545 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.341558 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.341568 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:24Z","lastTransitionTime":"2025-12-11T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.346434 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.358945 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.368808 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc 
kubenswrapper[4898]: I1211 13:05:24.379497 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.392783 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.408170 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.422757 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.435051 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.444810 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.444837 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.444845 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.444858 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.444868 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:24Z","lastTransitionTime":"2025-12-11T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.447493 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbc9b844a873836a500af9d781e17c635495762b0e50af71dadf141ff00723a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:05:09Z\\\",\\\"message\\\":\\\"2025-12-11T13:04:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6\\\\n2025-12-11T13:04:23+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6 to /host/opt/cni/bin/\\\\n2025-12-11T13:04:24Z [verbose] multus-daemon started\\\\n2025-12-11T13:04:24Z [verbose] Readiness Indicator file check\\\\n2025-12-11T13:05:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.457101 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.467478 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbb0d9e2-813d-44ce-a547-a3c6f60c9f6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84fc12f1339e236216301663a693fc0ae8c043f282081b77e08455869556f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda5c81dd6cdc256d7acd24d32f956484e8823bed9e29c5713a722d8ebd6f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfca7c42a03929bac318edc66ce1a6c7f88f795fd3b58db686b8484edfac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.496519 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.508153 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.519670 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116c
e6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.535305 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a2b5e8508fbd418e99cd2fe5af41cf82ce69e8a4240c8c3c1965e8f7b0bfb9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"message\\\":\\\"pin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.41:9443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {589f95f7-f3e2-4140-80ed-9a0717201481}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1211 13:04:55.603698 6618 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nF1211 13:04:55.603702 6618 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:04:55Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:04:55.6\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:05:23Z\\\",\\\"message\\\":\\\"twork controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:05:23.341647 6982 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"ebd4748e-0473-49fb-88ad-83dbb221791a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:05:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f9
1e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.546172 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1c
d2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:24Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.547667 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.547695 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.547705 4898 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.547722 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.547735 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:24Z","lastTransitionTime":"2025-12-11T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.651182 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.651255 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.651272 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.651298 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.651324 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:24Z","lastTransitionTime":"2025-12-11T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.754108 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.754152 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.754163 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.754178 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.754189 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:24Z","lastTransitionTime":"2025-12-11T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.857128 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.857204 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.857216 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.857231 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.857243 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:24Z","lastTransitionTime":"2025-12-11T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.959984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.960031 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.960042 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.960059 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:24 crc kubenswrapper[4898]: I1211 13:05:24.960074 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:24Z","lastTransitionTime":"2025-12-11T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.063010 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.063073 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.063089 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.063116 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.063133 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:25Z","lastTransitionTime":"2025-12-11T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.166733 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.166776 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.166790 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.166807 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.166819 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:25Z","lastTransitionTime":"2025-12-11T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.269098 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.269148 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.269159 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.269178 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.269190 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:25Z","lastTransitionTime":"2025-12-11T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.303662 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovnkube-controller/3.log" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.309156 4898 scope.go:117] "RemoveContainer" containerID="b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde" Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.309356 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.326142 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.345614 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.371924 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.371987 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.372001 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.372017 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.372029 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:25Z","lastTransitionTime":"2025-12-11T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.379523 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:05:23Z\\\",\\\"message\\\":\\\"twork controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:05:23.341647 6982 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"ebd4748e-0473-49fb-88ad-83dbb221791a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:05:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d0
7281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.397872 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1c
d2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.417130 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbb0d9e2-813d-44ce-a547-a3c6f60c9f6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84fc12f1339e236216301663a693fc0ae8c043f282081b77e08455869556f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda5c81dd6cdc256d7acd24d32f956484e8823bed9e29c5713a722d8ebd6f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfca7c42a03929bac318edc66ce1a6c7f88f795fd3b58db686b8484edfac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.436971 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.449968 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.460865 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc 
kubenswrapper[4898]: I1211 13:05:25.475209 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.475285 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.475308 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.475338 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.475223 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 
13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.475362 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:25Z","lastTransitionTime":"2025-12-11T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.490114 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.509933 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.528676 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.545613 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.566823 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.570577 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.570750 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:29.570716735 +0000 UTC m=+147.143043222 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.580646 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.580723 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.580755 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.580768 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.580784 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.580796 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:25Z","lastTransitionTime":"2025-12-11T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.600799 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.613648 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.634174 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbc9b844a873836a500af9d781e17c635495762b0e50af71dadf141ff00723a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:05:09Z\\\",\\\"message\\\":\\\"2025-12-11T13:04:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6\\\\n2025-12-11T13:04:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6 to /host/opt/cni/bin/\\\\n2025-12-11T13:04:24Z [verbose] multus-daemon started\\\\n2025-12-11T13:04:24Z [verbose] 
Readiness Indicator file check\\\\n2025-12-11T13:05:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:25Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.672360 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.672500 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.672598 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.672653 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.672682 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.672708 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.672826 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 13:06:29.672790343 +0000 UTC m=+147.245116820 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.672846 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.672870 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 13:06:29.672849635 +0000 UTC m=+147.245176182 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.672882 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.672887 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.672917 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.672928 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.672944 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.673002 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-11 13:06:29.672984469 +0000 UTC m=+147.245310936 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.673045 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-11 13:06:29.673015259 +0000 UTC m=+147.245341786 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.684566 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.684631 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.684656 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.684699 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.684724 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:25Z","lastTransitionTime":"2025-12-11T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.773917 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.773973 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.774128 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.774313 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.774388 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.774541 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.774676 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:25 crc kubenswrapper[4898]: E1211 13:05:25.774810 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.787321 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.787380 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.787398 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.787422 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.787440 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:25Z","lastTransitionTime":"2025-12-11T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.890139 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.890176 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.890186 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.890199 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.890210 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:25Z","lastTransitionTime":"2025-12-11T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.993164 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.993253 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.993288 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.993322 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:25 crc kubenswrapper[4898]: I1211 13:05:25.993346 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:25Z","lastTransitionTime":"2025-12-11T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.095827 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.095864 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.095872 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.095885 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.095894 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:26Z","lastTransitionTime":"2025-12-11T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.199440 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.199563 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.199632 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.199675 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.199706 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:26Z","lastTransitionTime":"2025-12-11T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.301887 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.301948 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.301965 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.301990 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.302014 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:26Z","lastTransitionTime":"2025-12-11T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.404743 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.404790 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.404799 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.404815 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.404824 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:26Z","lastTransitionTime":"2025-12-11T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.507176 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.507214 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.507224 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.507259 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.507269 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:26Z","lastTransitionTime":"2025-12-11T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.609280 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.609349 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.609371 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.609396 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.609418 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:26Z","lastTransitionTime":"2025-12-11T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.712246 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.712320 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.712348 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.712376 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.712398 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:26Z","lastTransitionTime":"2025-12-11T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.814815 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.814873 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.814921 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.814944 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.814960 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:26Z","lastTransitionTime":"2025-12-11T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.917990 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.918079 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.918098 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.918131 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:26 crc kubenswrapper[4898]: I1211 13:05:26.918157 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:26Z","lastTransitionTime":"2025-12-11T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.020880 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.021412 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.021433 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.021498 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.021518 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:27Z","lastTransitionTime":"2025-12-11T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.125631 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.125672 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.125683 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.125699 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.125711 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:27Z","lastTransitionTime":"2025-12-11T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.229054 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.229113 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.229130 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.229154 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.229172 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:27Z","lastTransitionTime":"2025-12-11T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.331122 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.331179 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.331191 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.331213 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.331228 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:27Z","lastTransitionTime":"2025-12-11T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.437620 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.437676 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.437692 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.438170 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.438352 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:27Z","lastTransitionTime":"2025-12-11T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.481164 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.481209 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.481219 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.481235 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.481246 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:27Z","lastTransitionTime":"2025-12-11T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:27 crc kubenswrapper[4898]: E1211 13:05:27.502936 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.507125 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.507188 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.507213 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.507244 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.507269 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:27Z","lastTransitionTime":"2025-12-11T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:27 crc kubenswrapper[4898]: E1211 13:05:27.521752 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.525279 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.525325 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.525340 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.525360 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.525374 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:27Z","lastTransitionTime":"2025-12-11T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:27 crc kubenswrapper[4898]: E1211 13:05:27.538748 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.542818 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.542853 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.542864 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.542878 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.542890 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:27Z","lastTransitionTime":"2025-12-11T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:27 crc kubenswrapper[4898]: E1211 13:05:27.555892 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.559281 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.559323 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.559336 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.559352 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.559362 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:27Z","lastTransitionTime":"2025-12-11T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:27 crc kubenswrapper[4898]: E1211 13:05:27.572351 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:27Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:27 crc kubenswrapper[4898]: E1211 13:05:27.572561 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.574338 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.574385 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.574397 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.574412 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.574424 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:27Z","lastTransitionTime":"2025-12-11T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.677905 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.677964 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.677982 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.678041 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.678058 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:27Z","lastTransitionTime":"2025-12-11T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.774556 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:27 crc kubenswrapper[4898]: E1211 13:05:27.775002 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.774716 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:27 crc kubenswrapper[4898]: E1211 13:05:27.775274 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.774658 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.774731 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:27 crc kubenswrapper[4898]: E1211 13:05:27.775692 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:27 crc kubenswrapper[4898]: E1211 13:05:27.775825 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.781236 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.781366 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.781482 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.781518 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.781543 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:27Z","lastTransitionTime":"2025-12-11T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.884037 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.884104 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.884125 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.884153 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.884175 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:27Z","lastTransitionTime":"2025-12-11T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.987489 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.987547 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.987565 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.987594 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:27 crc kubenswrapper[4898]: I1211 13:05:27.987613 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:27Z","lastTransitionTime":"2025-12-11T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.090037 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.090076 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.090086 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.090102 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.090112 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:28Z","lastTransitionTime":"2025-12-11T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.192153 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.192223 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.192237 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.192252 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.192284 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:28Z","lastTransitionTime":"2025-12-11T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.294574 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.294911 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.295002 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.295224 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.295286 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:28Z","lastTransitionTime":"2025-12-11T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.398181 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.398234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.398244 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.398260 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.398291 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:28Z","lastTransitionTime":"2025-12-11T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.500466 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.500506 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.500516 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.500530 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.500539 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:28Z","lastTransitionTime":"2025-12-11T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.603525 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.604098 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.604241 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.604374 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.604517 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:28Z","lastTransitionTime":"2025-12-11T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.707116 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.707165 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.707178 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.707195 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.707208 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:28Z","lastTransitionTime":"2025-12-11T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.808960 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.809060 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.809078 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.809101 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.809118 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:28Z","lastTransitionTime":"2025-12-11T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.911149 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.911193 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.911205 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.911222 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:28 crc kubenswrapper[4898]: I1211 13:05:28.911234 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:28Z","lastTransitionTime":"2025-12-11T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.014112 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.014179 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.014203 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.014232 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.014255 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:29Z","lastTransitionTime":"2025-12-11T13:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.117100 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.117130 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.117140 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.117158 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.117168 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:29Z","lastTransitionTime":"2025-12-11T13:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.220073 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.220350 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.220542 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.220682 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.220807 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:29Z","lastTransitionTime":"2025-12-11T13:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.323543 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.323605 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.323627 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.323656 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.323677 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:29Z","lastTransitionTime":"2025-12-11T13:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.426547 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.426581 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.426590 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.426603 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.426613 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:29Z","lastTransitionTime":"2025-12-11T13:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.529816 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.529884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.529898 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.529915 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.529928 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:29Z","lastTransitionTime":"2025-12-11T13:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.632168 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.632451 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.632561 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.632652 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.632726 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:29Z","lastTransitionTime":"2025-12-11T13:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.734883 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.734935 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.734950 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.734970 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.734986 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:29Z","lastTransitionTime":"2025-12-11T13:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.774240 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.774284 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.774282 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:29 crc kubenswrapper[4898]: E1211 13:05:29.774358 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.774250 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:29 crc kubenswrapper[4898]: E1211 13:05:29.774551 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:29 crc kubenswrapper[4898]: E1211 13:05:29.774591 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:29 crc kubenswrapper[4898]: E1211 13:05:29.774813 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.838139 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.838190 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.838206 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.838225 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.838238 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:29Z","lastTransitionTime":"2025-12-11T13:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.941597 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.941666 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.941678 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.941697 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:29 crc kubenswrapper[4898]: I1211 13:05:29.941708 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:29Z","lastTransitionTime":"2025-12-11T13:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.044408 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.044534 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.044559 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.044591 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.044613 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:30Z","lastTransitionTime":"2025-12-11T13:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.147537 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.147599 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.147616 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.147639 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.147657 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:30Z","lastTransitionTime":"2025-12-11T13:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.249814 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.249853 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.249861 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.249875 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.249884 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:30Z","lastTransitionTime":"2025-12-11T13:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.351951 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.352017 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.352031 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.352046 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.352056 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:30Z","lastTransitionTime":"2025-12-11T13:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.454539 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.454614 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.454637 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.454666 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.454689 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:30Z","lastTransitionTime":"2025-12-11T13:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.557322 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.557365 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.557376 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.557388 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.557401 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:30Z","lastTransitionTime":"2025-12-11T13:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.659844 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.659889 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.659899 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.659915 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.659926 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:30Z","lastTransitionTime":"2025-12-11T13:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.762615 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.762672 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.762687 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.762707 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.762723 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:30Z","lastTransitionTime":"2025-12-11T13:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.786437 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.865385 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.865429 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.865440 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.865480 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.865492 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:30Z","lastTransitionTime":"2025-12-11T13:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.967839 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.967879 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.967894 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.967911 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:30 crc kubenswrapper[4898]: I1211 13:05:30.967923 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:30Z","lastTransitionTime":"2025-12-11T13:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.071144 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.071193 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.071206 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.071225 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.071236 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:31Z","lastTransitionTime":"2025-12-11T13:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.174538 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.174568 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.174578 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.174594 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.174606 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:31Z","lastTransitionTime":"2025-12-11T13:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.277103 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.277169 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.277190 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.277216 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.277236 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:31Z","lastTransitionTime":"2025-12-11T13:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.382828 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.383110 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.383175 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.383284 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.383345 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:31Z","lastTransitionTime":"2025-12-11T13:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.485524 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.485775 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.485929 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.486097 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.486333 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:31Z","lastTransitionTime":"2025-12-11T13:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.588734 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.589124 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.589328 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.589576 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.589775 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:31Z","lastTransitionTime":"2025-12-11T13:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.692208 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.692254 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.692265 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.692284 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.692296 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:31Z","lastTransitionTime":"2025-12-11T13:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.774476 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.774483 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:31 crc kubenswrapper[4898]: E1211 13:05:31.775338 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.774771 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:31 crc kubenswrapper[4898]: E1211 13:05:31.775418 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:31 crc kubenswrapper[4898]: E1211 13:05:31.775265 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.774510 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:31 crc kubenswrapper[4898]: E1211 13:05:31.775603 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.795561 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.795612 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.795637 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.795659 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.795670 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:31Z","lastTransitionTime":"2025-12-11T13:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.900078 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.900484 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.900552 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.900625 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:31 crc kubenswrapper[4898]: I1211 13:05:31.900739 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:31Z","lastTransitionTime":"2025-12-11T13:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.003588 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.003639 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.003656 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.003683 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.003701 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:32Z","lastTransitionTime":"2025-12-11T13:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.106078 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.106658 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.106772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.106862 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.106973 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:32Z","lastTransitionTime":"2025-12-11T13:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.210001 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.210075 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.210088 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.210103 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.210114 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:32Z","lastTransitionTime":"2025-12-11T13:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.312925 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.312979 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.312991 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.313007 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.313018 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:32Z","lastTransitionTime":"2025-12-11T13:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.416151 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.416198 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.416210 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.416227 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.416240 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:32Z","lastTransitionTime":"2025-12-11T13:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.519690 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.519750 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.519769 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.519797 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.519821 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:32Z","lastTransitionTime":"2025-12-11T13:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.623071 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.623151 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.623170 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.623190 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.623207 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:32Z","lastTransitionTime":"2025-12-11T13:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.727128 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.727187 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.727205 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.727230 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.727250 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:32Z","lastTransitionTime":"2025-12-11T13:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.796794 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.817200 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.830003 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.830082 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.830110 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.830143 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.830166 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:32Z","lastTransitionTime":"2025-12-11T13:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.837966 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbc9b844a873836a500af9d781e17c635495762b0e50af71dadf141ff00723a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:05:09Z\\\",\\\"message\\\":\\\"2025-12-11T13:04:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6\\\\n2025-12-11T13:04:23+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6 to /host/opt/cni/bin/\\\\n2025-12-11T13:04:24Z [verbose] multus-daemon started\\\\n2025-12-11T13:04:24Z [verbose] Readiness Indicator file check\\\\n2025-12-11T13:05:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.856131 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.875248 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbb0d9e2-813d-44ce-a547-a3c6f60c9f6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84fc12f1339e236216301663a693fc0ae8c043f282081b77e08455869556f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda5c81dd6cdc256d7acd24d32f956484e8823bed9e29c5713a722d8ebd6f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfca7c42a03929bac318edc66ce1a6c7f88f795fd3b58db686b8484edfac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.887989 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4281bcb0-26b5-4c42-89da-27cf45eb279a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02c825a08c28d57d3fe2034e57f21ac7b8ca377df15a37ad6757b4d0c6632f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62d748867b692dbb01e602f478b8af8cfee39ad2effae7fb32c2d53207c52519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d748867b692dbb01e602f478b8af8cfee39ad2effae7fb32c2d53207c52519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.916017 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.927429 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.932285 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.932319 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.932331 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.932347 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:32 crc 
kubenswrapper[4898]: I1211 13:05:32.932359 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:32Z","lastTransitionTime":"2025-12-11T13:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.941938 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.962087 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:05:23Z\\\",\\\"message\\\":\\\"twork controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:05:23.341647 6982 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"ebd4748e-0473-49fb-88ad-83dbb221791a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:05:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d0
7281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.978381 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1c
d2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:32 crc kubenswrapper[4898]: I1211 13:05:32.993401 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\"
,\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d2
42327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:32Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.009246 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.027397 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.035023 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.035047 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.035054 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.035066 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.035074 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:33Z","lastTransitionTime":"2025-12-11T13:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.044542 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5e
df28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.068018 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:33 crc 
kubenswrapper[4898]: I1211 13:05:33.086902 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.102239 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.119090 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:33Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.137893 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.137935 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.137945 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.137961 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.137975 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:33Z","lastTransitionTime":"2025-12-11T13:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.241015 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.241071 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.241083 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.241102 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.241114 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:33Z","lastTransitionTime":"2025-12-11T13:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.343501 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.343529 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.343537 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.343549 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.343557 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:33Z","lastTransitionTime":"2025-12-11T13:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.446192 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.446238 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.446254 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.446276 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.446293 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:33Z","lastTransitionTime":"2025-12-11T13:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.550037 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.550363 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.550598 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.550801 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.551083 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:33Z","lastTransitionTime":"2025-12-11T13:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.655009 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.655087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.655110 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.655140 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.655166 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:33Z","lastTransitionTime":"2025-12-11T13:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.758105 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.758155 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.758166 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.758181 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.758193 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:33Z","lastTransitionTime":"2025-12-11T13:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.774793 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.774827 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.774876 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:33 crc kubenswrapper[4898]: E1211 13:05:33.774963 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.774980 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:33 crc kubenswrapper[4898]: E1211 13:05:33.775074 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:33 crc kubenswrapper[4898]: E1211 13:05:33.775201 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:33 crc kubenswrapper[4898]: E1211 13:05:33.775320 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.861989 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.862038 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.862056 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.862078 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.862095 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:33Z","lastTransitionTime":"2025-12-11T13:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.964231 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.964278 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.964287 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.964305 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:33 crc kubenswrapper[4898]: I1211 13:05:33.964316 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:33Z","lastTransitionTime":"2025-12-11T13:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.066842 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.066893 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.066905 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.066925 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.066937 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:34Z","lastTransitionTime":"2025-12-11T13:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.168993 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.169063 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.169085 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.169112 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.169133 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:34Z","lastTransitionTime":"2025-12-11T13:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.271862 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.271914 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.271928 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.271951 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.271966 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:34Z","lastTransitionTime":"2025-12-11T13:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.374384 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.374423 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.374432 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.374446 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.374472 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:34Z","lastTransitionTime":"2025-12-11T13:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.477529 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.477589 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.477606 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.477627 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.477644 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:34Z","lastTransitionTime":"2025-12-11T13:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.580143 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.580199 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.580210 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.580228 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.580240 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:34Z","lastTransitionTime":"2025-12-11T13:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.682923 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.682998 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.683013 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.683028 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.683040 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:34Z","lastTransitionTime":"2025-12-11T13:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.785289 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.785347 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.785363 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.785383 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.785396 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:34Z","lastTransitionTime":"2025-12-11T13:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.888272 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.888308 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.888316 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.888328 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.888337 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:34Z","lastTransitionTime":"2025-12-11T13:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.991480 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.991547 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.991563 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.991588 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:34 crc kubenswrapper[4898]: I1211 13:05:34.991605 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:34Z","lastTransitionTime":"2025-12-11T13:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.094915 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.095293 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.095430 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.095600 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.095723 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:35Z","lastTransitionTime":"2025-12-11T13:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.199281 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.199662 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.199804 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.199980 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.200137 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:35Z","lastTransitionTime":"2025-12-11T13:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.303963 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.304024 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.304041 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.304065 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.304082 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:35Z","lastTransitionTime":"2025-12-11T13:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.407558 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.407606 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.407625 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.407649 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.407670 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:35Z","lastTransitionTime":"2025-12-11T13:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.510388 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.510439 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.510483 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.510503 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.510516 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:35Z","lastTransitionTime":"2025-12-11T13:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.612632 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.612716 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.612738 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.612778 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.612800 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:35Z","lastTransitionTime":"2025-12-11T13:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.715975 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.716025 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.716037 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.716056 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.716068 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:35Z","lastTransitionTime":"2025-12-11T13:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.774356 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.774439 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:35 crc kubenswrapper[4898]: E1211 13:05:35.774638 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.774694 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.774387 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:35 crc kubenswrapper[4898]: E1211 13:05:35.774828 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:35 crc kubenswrapper[4898]: E1211 13:05:35.774963 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:35 crc kubenswrapper[4898]: E1211 13:05:35.775063 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.776146 4898 scope.go:117] "RemoveContainer" containerID="b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde" Dec 11 13:05:35 crc kubenswrapper[4898]: E1211 13:05:35.776389 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.819063 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.819110 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.819122 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.819140 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.819153 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:35Z","lastTransitionTime":"2025-12-11T13:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.921645 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.921683 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.921752 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.921764 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:35 crc kubenswrapper[4898]: I1211 13:05:35.921773 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:35Z","lastTransitionTime":"2025-12-11T13:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.023868 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.023915 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.023927 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.023944 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.023957 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:36Z","lastTransitionTime":"2025-12-11T13:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.126871 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.126924 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.126939 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.126957 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.126971 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:36Z","lastTransitionTime":"2025-12-11T13:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.229566 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.229626 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.229644 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.229667 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.229684 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:36Z","lastTransitionTime":"2025-12-11T13:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.332966 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.333038 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.333049 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.333065 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.333075 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:36Z","lastTransitionTime":"2025-12-11T13:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.435884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.435954 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.435965 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.435979 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.435992 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:36Z","lastTransitionTime":"2025-12-11T13:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.538759 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.538829 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.538847 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.538875 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.538897 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:36Z","lastTransitionTime":"2025-12-11T13:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.641736 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.641796 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.641813 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.641835 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.641851 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:36Z","lastTransitionTime":"2025-12-11T13:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.745485 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.745552 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.745578 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.745608 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.745630 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:36Z","lastTransitionTime":"2025-12-11T13:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.848812 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.848895 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.848920 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.848948 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.848969 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:36Z","lastTransitionTime":"2025-12-11T13:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.951989 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.952057 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.952077 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.952106 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:36 crc kubenswrapper[4898]: I1211 13:05:36.952131 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:36Z","lastTransitionTime":"2025-12-11T13:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.055617 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.055678 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.055697 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.055720 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.055744 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:37Z","lastTransitionTime":"2025-12-11T13:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.159529 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.159578 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.159594 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.159619 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.159637 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:37Z","lastTransitionTime":"2025-12-11T13:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.262560 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.262623 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.262632 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.262646 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.262655 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:37Z","lastTransitionTime":"2025-12-11T13:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.365118 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.365367 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.365517 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.365600 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.365782 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:37Z","lastTransitionTime":"2025-12-11T13:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.468203 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.468266 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.468283 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.468309 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.468327 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:37Z","lastTransitionTime":"2025-12-11T13:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.570946 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.570977 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.570985 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.571000 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.571008 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:37Z","lastTransitionTime":"2025-12-11T13:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.673708 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.673755 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.673776 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.673803 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.673824 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:37Z","lastTransitionTime":"2025-12-11T13:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.774205 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.774249 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.774304 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:37 crc kubenswrapper[4898]: E1211 13:05:37.774417 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.774249 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.775018 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.775073 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.775092 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.775114 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:37 crc kubenswrapper[4898]: E1211 13:05:37.775121 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.775132 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:37Z","lastTransitionTime":"2025-12-11T13:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:37 crc kubenswrapper[4898]: E1211 13:05:37.775270 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:37 crc kubenswrapper[4898]: E1211 13:05:37.774946 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:37 crc kubenswrapper[4898]: E1211 13:05:37.794156 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:37Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.799168 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.799417 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.799651 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.799813 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.799950 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:37Z","lastTransitionTime":"2025-12-11T13:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:37 crc kubenswrapper[4898]: E1211 13:05:37.814827 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:37Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.819005 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.819054 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.819070 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.819092 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.819111 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:37Z","lastTransitionTime":"2025-12-11T13:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:37 crc kubenswrapper[4898]: E1211 13:05:37.836754 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:37Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.841622 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.841673 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.841692 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.841721 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.841743 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:37Z","lastTransitionTime":"2025-12-11T13:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:37 crc kubenswrapper[4898]: E1211 13:05:37.863741 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:37Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.869048 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.869289 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.869690 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.870062 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.870444 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:37Z","lastTransitionTime":"2025-12-11T13:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:37 crc kubenswrapper[4898]: E1211 13:05:37.893702 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:37Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:37 crc kubenswrapper[4898]: E1211 13:05:37.894346 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.896445 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.896640 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.896785 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.896923 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:37 crc kubenswrapper[4898]: I1211 13:05:37.897060 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:37Z","lastTransitionTime":"2025-12-11T13:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.000293 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.000350 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.000364 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.000381 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.000392 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:38Z","lastTransitionTime":"2025-12-11T13:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.103958 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.104416 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.104630 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.104826 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.105027 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:38Z","lastTransitionTime":"2025-12-11T13:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.208442 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.208925 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.209119 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.209273 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.209396 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:38Z","lastTransitionTime":"2025-12-11T13:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.312544 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.312899 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.313051 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.313183 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.313306 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:38Z","lastTransitionTime":"2025-12-11T13:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.417140 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.417212 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.417234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.417258 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.417275 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:38Z","lastTransitionTime":"2025-12-11T13:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.520000 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.520056 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.520073 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.520097 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.520114 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:38Z","lastTransitionTime":"2025-12-11T13:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.623154 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.623223 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.623242 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.623272 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.623290 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:38Z","lastTransitionTime":"2025-12-11T13:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.726044 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.726702 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.726745 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.726774 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.726796 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:38Z","lastTransitionTime":"2025-12-11T13:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.829811 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.829884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.829895 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.829921 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.829934 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:38Z","lastTransitionTime":"2025-12-11T13:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.932114 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.932186 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.932202 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.932228 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:38 crc kubenswrapper[4898]: I1211 13:05:38.932292 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:38Z","lastTransitionTime":"2025-12-11T13:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.035245 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.035277 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.035285 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.035300 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.035308 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:39Z","lastTransitionTime":"2025-12-11T13:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.137890 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.138199 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.138440 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.138733 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.138950 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:39Z","lastTransitionTime":"2025-12-11T13:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.241433 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.241505 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.241518 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.241536 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.241548 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:39Z","lastTransitionTime":"2025-12-11T13:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.345199 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.345265 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.345287 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.345315 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.345337 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:39Z","lastTransitionTime":"2025-12-11T13:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.448862 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.448903 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.448915 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.448949 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.448960 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:39Z","lastTransitionTime":"2025-12-11T13:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.551648 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.551730 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.551757 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.551787 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.551808 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:39Z","lastTransitionTime":"2025-12-11T13:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.655220 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.655295 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.655320 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.655344 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.655363 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:39Z","lastTransitionTime":"2025-12-11T13:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.758520 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.758583 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.758600 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.758624 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.758642 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:39Z","lastTransitionTime":"2025-12-11T13:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.774899 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.774925 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:39 crc kubenswrapper[4898]: E1211 13:05:39.775060 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.775106 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.775142 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:39 crc kubenswrapper[4898]: E1211 13:05:39.775259 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:39 crc kubenswrapper[4898]: E1211 13:05:39.775327 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:39 crc kubenswrapper[4898]: E1211 13:05:39.775430 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.861311 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.861357 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.861373 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.861393 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.861410 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:39Z","lastTransitionTime":"2025-12-11T13:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.964120 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.964155 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.964166 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.964185 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:39 crc kubenswrapper[4898]: I1211 13:05:39.964196 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:39Z","lastTransitionTime":"2025-12-11T13:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.066161 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.066214 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.066235 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.066255 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.066273 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:40Z","lastTransitionTime":"2025-12-11T13:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.168599 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.168674 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.168698 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.168728 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.168751 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:40Z","lastTransitionTime":"2025-12-11T13:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.272094 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.272160 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.272179 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.272202 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.272219 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:40Z","lastTransitionTime":"2025-12-11T13:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.330217 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs\") pod \"network-metrics-daemon-zcq7l\" (UID: \"34380c7c-1d75-4f6f-a6cb-b015a55ca978\") " pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:40 crc kubenswrapper[4898]: E1211 13:05:40.330499 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 13:05:40 crc kubenswrapper[4898]: E1211 13:05:40.330652 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs podName:34380c7c-1d75-4f6f-a6cb-b015a55ca978 nodeName:}" failed. No retries permitted until 2025-12-11 13:06:44.330614628 +0000 UTC m=+161.902941095 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs") pod "network-metrics-daemon-zcq7l" (UID: "34380c7c-1d75-4f6f-a6cb-b015a55ca978") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.374972 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.375017 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.375029 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.375044 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.375056 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:40Z","lastTransitionTime":"2025-12-11T13:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.477703 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.477753 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.477766 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.477781 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.477792 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:40Z","lastTransitionTime":"2025-12-11T13:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.580802 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.580867 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.580885 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.580911 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.580933 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:40Z","lastTransitionTime":"2025-12-11T13:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.683384 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.683503 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.683529 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.683556 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.683572 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:40Z","lastTransitionTime":"2025-12-11T13:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.785692 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.785725 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.785734 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.785746 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.785754 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:40Z","lastTransitionTime":"2025-12-11T13:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.889086 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.889127 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.889138 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.889154 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.889165 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:40Z","lastTransitionTime":"2025-12-11T13:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.991698 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.991765 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.991788 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.991818 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:40 crc kubenswrapper[4898]: I1211 13:05:40.991845 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:40Z","lastTransitionTime":"2025-12-11T13:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.094795 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.094846 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.094864 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.094886 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.094900 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:41Z","lastTransitionTime":"2025-12-11T13:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.196937 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.196977 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.196991 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.197006 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.197018 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:41Z","lastTransitionTime":"2025-12-11T13:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.299915 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.299986 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.300010 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.300040 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.300062 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:41Z","lastTransitionTime":"2025-12-11T13:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.403131 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.403192 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.403209 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.403235 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.403254 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:41Z","lastTransitionTime":"2025-12-11T13:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.506031 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.506091 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.506103 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.506119 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.506130 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:41Z","lastTransitionTime":"2025-12-11T13:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.607711 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.607746 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.607754 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.607768 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.607777 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:41Z","lastTransitionTime":"2025-12-11T13:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.709581 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.709615 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.709623 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.709635 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.709646 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:41Z","lastTransitionTime":"2025-12-11T13:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.774118 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.774126 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.774132 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.774491 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:41 crc kubenswrapper[4898]: E1211 13:05:41.774685 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:41 crc kubenswrapper[4898]: E1211 13:05:41.774853 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:41 crc kubenswrapper[4898]: E1211 13:05:41.774930 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:41 crc kubenswrapper[4898]: E1211 13:05:41.775039 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.812282 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.812313 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.812320 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.812332 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.812342 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:41Z","lastTransitionTime":"2025-12-11T13:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.914132 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.914201 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.914216 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.914237 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:41 crc kubenswrapper[4898]: I1211 13:05:41.914266 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:41Z","lastTransitionTime":"2025-12-11T13:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.017261 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.017326 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.017348 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.017375 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.017397 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:42Z","lastTransitionTime":"2025-12-11T13:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.120952 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.121032 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.121043 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.121059 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.121069 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:42Z","lastTransitionTime":"2025-12-11T13:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.224478 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.224545 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.224562 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.224589 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.224606 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:42Z","lastTransitionTime":"2025-12-11T13:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.328161 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.328222 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.328244 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.328272 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.328293 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:42Z","lastTransitionTime":"2025-12-11T13:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.431581 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.431667 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.431690 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.431716 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.431740 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:42Z","lastTransitionTime":"2025-12-11T13:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.535444 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.535563 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.535586 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.535614 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.535636 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:42Z","lastTransitionTime":"2025-12-11T13:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.638996 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.639063 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.639080 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.639117 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.639147 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:42Z","lastTransitionTime":"2025-12-11T13:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.742197 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.742276 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.742294 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.742320 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.742337 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:42Z","lastTransitionTime":"2025-12-11T13:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.791916 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca3db723-c4e9-4a38-9cee-827f045c7be3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b72e9f6787b4bfadbc89cf669d5f31678f0a85a726107e3b3d2e06eff0e18ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a52b842d9e7
001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583816a3f9ba9a092d6cd8d75bba54cf42e3ff9f65230f83822baec967dd53dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f419f755ec0a61ffad06dca0d35df43386ebd253152f29cf04312cd0ea89d5de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.807350 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.825641 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.842322 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jmvlr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8009db93-3f68-4a84-87ae-863f64e231e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08079d059f71e05ad9d28ebed3d2608a035d3b6fc87db1d80b3974b439ac5bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkmx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jmvlr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.844917 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.844973 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.844986 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.845005 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.845018 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:42Z","lastTransitionTime":"2025-12-11T13:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.859415 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://337971ff1d0d3cf7ee256777a71e4549ec0b973b58b437caaa152c589047141f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.875824 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28cc743c1173591c6ee75d4bb3841608d01bca50dab896d4c5de930a67ee22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.898960 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dlqfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e8ed6cb-b822-4b64-9e00-e755c5aea812\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbc9b844a873836a500af9d781e17c635495762b0e50af71dadf141ff00723a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:05:09Z\\\",\\\"message\\\":\\\"2025-12-11T13:04:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6\\\\n2025-12-11T13:04:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_be5097ed-4df8-4212-b1ed-ee3e7c74e5b6 to /host/opt/cni/bin/\\\\n2025-12-11T13:04:24Z [verbose] multus-daemon started\\\\n2025-12-11T13:04:24Z [verbose] Readiness Indicator file check\\\\n2025-12-11T13:05:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvrpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dlqfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.913139 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h86cf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00c0a5c3-6be3-4c77-a628-cf6710a1f10f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cb94931502e22d03595fc3289c2b098ecf881b9f8c14180a036a11b2f65bd91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xv2hc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h86cf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.928612 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6782ac5869029f1d6c05198c469d34070decbbbbab0c093963024040f067ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7b98c50effebe09db7e51fe139da11af7b116c
e6b7d81f0dada92fa29c82a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cv42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7mmvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.947820 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.947889 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.947934 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:42 crc 
kubenswrapper[4898]: I1211 13:05:42.947966 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.947992 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:42Z","lastTransitionTime":"2025-12-11T13:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.961194 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-11T13:05:23Z\\\",\\\"message\\\":\\\"twork controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:23Z is after 2025-08-24T17:21:41Z]\\\\nI1211 13:05:23.341647 6982 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"ebd4748e-0473-49fb-88ad-83dbb221791a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:05:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fc397b24e8480c4d0
7281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-729s2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qndxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.975734 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f71cb6f-609f-4fab-86eb-e01518fb8c61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d17922d87e50d85f50e76fef19a44cedae3bfd35727f8531089cfcf6da12d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cedba92b63b5430b84d32f2b59c5cc0c4e1c
d2bc2d513f2f905ebb59c34639b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwpjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-brphv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:42 crc kubenswrapper[4898]: I1211 13:05:42.990598 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbb0d9e2-813d-44ce-a547-a3c6f60c9f6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e84fc12f1339e236216301663a693fc0ae8c043f282081b77e08455869556f40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcda5c81dd6cdc256d7acd24d32f956484e8823bed9e29c5713a722d8ebd6f46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfca7c42a03929bac318edc66ce1a6c7f88f795fd3b58db686b8484edfac60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e329db1261719ccef14dcf3965ff73799b275591613d4035cb820253dced79b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.001733 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4281bcb0-26b5-4c42-89da-27cf45eb279a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02c825a08c28d57d3fe2034e57f21ac7b8ca377df15a37ad6757b4d0c6632f2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62d748867b692dbb01e602f478b8af8cfee39ad2effae7fb32c2d53207c52519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62d748867b692dbb01e602f478b8af8cfee39ad2effae7fb32c2d53207c52519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:42Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.026165 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88d684b6-361a-40e6-978e-b46ea34398f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd8f4f5e41c02e8bfa32782027f6483a3568a3e1cb28f2abc7807eaada32646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad23dc07c8ad7cf6100e3b53e1d9ed8abac20e942968edb5decdde1b263de306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce1bf1a7718645350f992e26a366a0fefa0d757b696c932d3552dd24a4c93550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4801f56faef50a1c100b7f54be9508a2be17ab655648fdb92cffaab1cb198a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87f5a835b639c4ce50c0d2b5f1a865f1b67bc5cc06d9fd88875d9defb71dbed2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77f3c45a0ef619bd5ee7b89f62fdaf4ae983c8f75e23034eb88019194f6ef88a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f392949d5ebc630af8b0646f8f482f806803d464e2cedbcbe731328d6ba5ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf8553e4b1a22afaf454efae3707136b9e6e320a54fd269f48a5f72297249d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:43Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.040225 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45192e97-2770-4866-9865-dc2f45b3f616\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4fc2a5689b589ccda6ec382b6a586f4e47c42d4f137d852e28688a18b98e6e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14a9667311a747f8e844897394432d67e0897b22569dd1314f4ebeccfce1c531\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35bb75c37377565dd0410255d00b3a848a3d22801107bd4d9547ce5574cda121\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd3bae44b475629c42174d195e162b9afe545e8dc84857847b1e2d936cf1a32d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7d5edf28866aeb08ed16121e0cea183463e58a708b157da20edd847bdad9860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://951e5da3fc6041496960dcfcd28bb3e918a40d9303c6609880bc2f8dcca29688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39b6a95a0387b8f1ea4992
bfa28355272a299bbb81adddf7ad39835786b43b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96jns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7lxfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:43Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.051149 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.051192 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.051208 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.051228 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.051244 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:43Z","lastTransitionTime":"2025-12-11T13:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.051263 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34380c7c-1d75-4f6f-a6cb-b015a55ca978\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4cxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zcq7l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:43Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:43 crc 
kubenswrapper[4898]: I1211 13:05:43.065071 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fc485a2-a69e-4453-8c3d-4b9698caa632\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://765cb993103d23
770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1211 13:04:15.326141 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1211 13:04:15.327902 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-38072788/tls.crt::/tmp/serving-cert-38072788/tls.key\\\\\\\"\\\\nI1211 13:04:21.035582 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1211 13:04:21.038444 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1211 13:04:21.038483 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1211 13:04:21.038515 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1211 13:04:21.038527 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1211 13:04:21.044993 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1211 13:04:21.045022 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045029 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1211 13:04:21.045034 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1211 13:04:21.045038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1211 13:04:21.045043 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1211 
13:04:21.045047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 13:04:21.045297 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1211 13:04:21.046774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T13:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T13:04:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:43Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.076530 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9dab79b9393ae8ca48f0b4366d9b4e6fd25015a3d42f913b3d3d5ce78cbf0cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bee512bd389a5e402acad6fdbf075e8d90bbfb3899efaecee992ff8ef64617fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T13:04:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:43Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.087593 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T13:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:43Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.154147 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.154181 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.154190 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.154204 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.154214 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:43Z","lastTransitionTime":"2025-12-11T13:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.256926 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.257249 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.257382 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.257517 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.257609 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:43Z","lastTransitionTime":"2025-12-11T13:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.360347 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.360984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.361247 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.361554 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.361772 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:43Z","lastTransitionTime":"2025-12-11T13:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.464625 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.464833 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.464933 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.465005 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.465082 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:43Z","lastTransitionTime":"2025-12-11T13:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.568166 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.568242 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.568258 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.568290 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.568308 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:43Z","lastTransitionTime":"2025-12-11T13:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.671668 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.671748 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.671763 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.671789 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.671803 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:43Z","lastTransitionTime":"2025-12-11T13:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.774412 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.774648 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:43 crc kubenswrapper[4898]: E1211 13:05:43.774698 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.774747 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.774862 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.774858 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.774912 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.774937 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.774965 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.774986 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:43Z","lastTransitionTime":"2025-12-11T13:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:43 crc kubenswrapper[4898]: E1211 13:05:43.774866 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:43 crc kubenswrapper[4898]: E1211 13:05:43.774939 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:43 crc kubenswrapper[4898]: E1211 13:05:43.775303 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.877919 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.878337 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.878601 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.878811 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.878962 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:43Z","lastTransitionTime":"2025-12-11T13:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.982288 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.982330 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.982342 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.982358 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:43 crc kubenswrapper[4898]: I1211 13:05:43.982371 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:43Z","lastTransitionTime":"2025-12-11T13:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.084612 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.084686 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.084708 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.084734 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.084751 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:44Z","lastTransitionTime":"2025-12-11T13:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.187492 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.187541 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.187552 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.187568 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.187578 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:44Z","lastTransitionTime":"2025-12-11T13:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.290489 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.290747 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.290817 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.290879 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.290950 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:44Z","lastTransitionTime":"2025-12-11T13:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.393315 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.393620 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.393726 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.393818 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.393880 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:44Z","lastTransitionTime":"2025-12-11T13:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.496590 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.497116 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.497342 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.497601 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.498175 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:44Z","lastTransitionTime":"2025-12-11T13:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.601323 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.601363 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.601374 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.601392 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.601404 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:44Z","lastTransitionTime":"2025-12-11T13:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.704524 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.704570 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.704586 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.704608 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.704625 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:44Z","lastTransitionTime":"2025-12-11T13:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.806493 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.806704 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.806792 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.806865 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.806927 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:44Z","lastTransitionTime":"2025-12-11T13:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.909146 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.909199 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.909215 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.909234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:44 crc kubenswrapper[4898]: I1211 13:05:44.909250 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:44Z","lastTransitionTime":"2025-12-11T13:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.012720 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.012785 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.012802 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.012824 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.012839 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:45Z","lastTransitionTime":"2025-12-11T13:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.114800 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.114843 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.114854 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.114869 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.114882 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:45Z","lastTransitionTime":"2025-12-11T13:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.217287 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.217331 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.217350 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.217370 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.217384 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:45Z","lastTransitionTime":"2025-12-11T13:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.320057 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.320294 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.320407 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.320533 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.320630 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:45Z","lastTransitionTime":"2025-12-11T13:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.422733 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.422773 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.422784 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.422798 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.422808 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:45Z","lastTransitionTime":"2025-12-11T13:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.524821 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.524859 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.524869 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.524883 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.524893 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:45Z","lastTransitionTime":"2025-12-11T13:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.627022 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.627063 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.627071 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.627085 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.627095 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:45Z","lastTransitionTime":"2025-12-11T13:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.729415 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.729891 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.730088 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.730240 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.730369 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:45Z","lastTransitionTime":"2025-12-11T13:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.773981 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.773987 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.774050 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:45 crc kubenswrapper[4898]: E1211 13:05:45.774139 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.774235 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:45 crc kubenswrapper[4898]: E1211 13:05:45.774275 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:45 crc kubenswrapper[4898]: E1211 13:05:45.774424 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:45 crc kubenswrapper[4898]: E1211 13:05:45.774527 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.832543 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.832726 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.832808 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.832896 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.832979 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:45Z","lastTransitionTime":"2025-12-11T13:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.936613 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.936690 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.936708 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.936758 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:45 crc kubenswrapper[4898]: I1211 13:05:45.936775 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:45Z","lastTransitionTime":"2025-12-11T13:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.039707 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.039766 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.039783 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.039805 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.039820 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:46Z","lastTransitionTime":"2025-12-11T13:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.142902 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.142964 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.142986 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.143013 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.143034 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:46Z","lastTransitionTime":"2025-12-11T13:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.246700 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.246733 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.246742 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.246756 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.246765 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:46Z","lastTransitionTime":"2025-12-11T13:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.349340 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.349402 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.349428 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.349538 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.349569 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:46Z","lastTransitionTime":"2025-12-11T13:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.451864 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.451894 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.451902 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.451914 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.451922 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:46Z","lastTransitionTime":"2025-12-11T13:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.554846 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.554916 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.554941 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.554969 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.555002 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:46Z","lastTransitionTime":"2025-12-11T13:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.657209 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.657775 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.658000 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.658206 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.658394 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:46Z","lastTransitionTime":"2025-12-11T13:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.761762 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.761844 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.761858 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.761876 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.761889 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:46Z","lastTransitionTime":"2025-12-11T13:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.865376 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.865483 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.865509 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.865539 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.865560 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:46Z","lastTransitionTime":"2025-12-11T13:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.968605 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.968653 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.968665 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.968682 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:46 crc kubenswrapper[4898]: I1211 13:05:46.968696 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:46Z","lastTransitionTime":"2025-12-11T13:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.071622 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.071690 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.071707 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.071732 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.071750 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:47Z","lastTransitionTime":"2025-12-11T13:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.175154 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.175215 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.175233 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.175255 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.175276 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:47Z","lastTransitionTime":"2025-12-11T13:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.277955 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.278000 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.278015 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.278033 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.278045 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:47Z","lastTransitionTime":"2025-12-11T13:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.380366 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.380418 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.380433 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.380475 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.380492 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:47Z","lastTransitionTime":"2025-12-11T13:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.483706 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.483773 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.483796 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.483827 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.483850 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:47Z","lastTransitionTime":"2025-12-11T13:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.587066 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.587140 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.587157 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.587188 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.587205 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:47Z","lastTransitionTime":"2025-12-11T13:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.690537 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.690598 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.690620 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.690643 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.690660 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:47Z","lastTransitionTime":"2025-12-11T13:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.774327 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.774363 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.774327 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.774819 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:47 crc kubenswrapper[4898]: E1211 13:05:47.774906 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.775030 4898 scope.go:117] "RemoveContainer" containerID="b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde" Dec 11 13:05:47 crc kubenswrapper[4898]: E1211 13:05:47.775019 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:47 crc kubenswrapper[4898]: E1211 13:05:47.775219 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" Dec 11 13:05:47 crc kubenswrapper[4898]: E1211 13:05:47.775276 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:47 crc kubenswrapper[4898]: E1211 13:05:47.775395 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.793515 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.793583 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.793601 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.793626 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.793647 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:47Z","lastTransitionTime":"2025-12-11T13:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.900992 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.901031 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.901041 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.901106 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:47 crc kubenswrapper[4898]: I1211 13:05:47.901118 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:47Z","lastTransitionTime":"2025-12-11T13:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.004905 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.004940 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.004951 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.004968 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.004980 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:48Z","lastTransitionTime":"2025-12-11T13:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.107328 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.107380 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.107393 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.107412 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.107424 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:48Z","lastTransitionTime":"2025-12-11T13:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.211450 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.212092 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.212326 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.212583 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.212830 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:48Z","lastTransitionTime":"2025-12-11T13:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.228151 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.228184 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.228207 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.228229 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.228242 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:48Z","lastTransitionTime":"2025-12-11T13:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:48 crc kubenswrapper[4898]: E1211 13:05:48.250248 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:48Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.254082 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.254126 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.254137 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.254152 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.254164 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:48Z","lastTransitionTime":"2025-12-11T13:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:48 crc kubenswrapper[4898]: E1211 13:05:48.270666 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:48Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.274698 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.274736 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.274748 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.274763 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.274776 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:48Z","lastTransitionTime":"2025-12-11T13:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:48 crc kubenswrapper[4898]: E1211 13:05:48.289603 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:48Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.294960 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.295020 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.295053 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.295073 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.295085 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:48Z","lastTransitionTime":"2025-12-11T13:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:48 crc kubenswrapper[4898]: E1211 13:05:48.310392 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:48Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.314724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.314751 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.314763 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.314783 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.314797 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:48Z","lastTransitionTime":"2025-12-11T13:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:48 crc kubenswrapper[4898]: E1211 13:05:48.328356 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-11T13:05:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"160776e4-8c30-4eed-9dbf-aa0b51733cfb\\\",\\\"systemUUID\\\":\\\"3ede5a2c-67f9-4bff-827e-03a23908e5c0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T13:05:48Z is after 2025-08-24T17:21:41Z" Dec 11 13:05:48 crc kubenswrapper[4898]: E1211 13:05:48.328723 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.330422 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.330466 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.330477 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.330492 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.330504 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:48Z","lastTransitionTime":"2025-12-11T13:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.432235 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.432271 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.432309 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.432324 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.432332 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:48Z","lastTransitionTime":"2025-12-11T13:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.535043 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.535130 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.535148 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.535169 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.535186 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:48Z","lastTransitionTime":"2025-12-11T13:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.637377 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.637488 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.637506 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.637525 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.637538 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:48Z","lastTransitionTime":"2025-12-11T13:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.740442 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.740559 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.740587 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.740616 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.740639 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:48Z","lastTransitionTime":"2025-12-11T13:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.843094 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.843129 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.843137 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.843151 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.843160 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:48Z","lastTransitionTime":"2025-12-11T13:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.945768 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.945835 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.945846 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.945860 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:48 crc kubenswrapper[4898]: I1211 13:05:48.945871 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:48Z","lastTransitionTime":"2025-12-11T13:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.048807 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.048865 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.048882 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.048906 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.048927 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:49Z","lastTransitionTime":"2025-12-11T13:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.151845 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.151924 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.151935 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.151954 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.151966 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:49Z","lastTransitionTime":"2025-12-11T13:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.255740 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.255859 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.255884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.255915 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.255941 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:49Z","lastTransitionTime":"2025-12-11T13:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.359059 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.359120 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.359131 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.359151 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.359163 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:49Z","lastTransitionTime":"2025-12-11T13:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.461703 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.461772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.461796 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.461828 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.461851 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:49Z","lastTransitionTime":"2025-12-11T13:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.564134 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.564213 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.564239 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.564267 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.564290 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:49Z","lastTransitionTime":"2025-12-11T13:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.667440 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.668361 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.668386 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.668416 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.668435 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:49Z","lastTransitionTime":"2025-12-11T13:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.771307 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.771366 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.771384 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.771421 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.771435 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:49Z","lastTransitionTime":"2025-12-11T13:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.774638 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.774692 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.774762 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:49 crc kubenswrapper[4898]: E1211 13:05:49.774903 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.774922 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:49 crc kubenswrapper[4898]: E1211 13:05:49.775079 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:49 crc kubenswrapper[4898]: E1211 13:05:49.775110 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:49 crc kubenswrapper[4898]: E1211 13:05:49.775177 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.874819 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.874870 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.874882 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.874903 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.874917 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:49Z","lastTransitionTime":"2025-12-11T13:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.978786 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.978840 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.978852 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.978873 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:49 crc kubenswrapper[4898]: I1211 13:05:49.978889 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:49Z","lastTransitionTime":"2025-12-11T13:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.082802 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.082874 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.082885 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.082922 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.082933 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:50Z","lastTransitionTime":"2025-12-11T13:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.186308 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.186368 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.186386 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.186410 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.186429 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:50Z","lastTransitionTime":"2025-12-11T13:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.289409 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.289492 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.289518 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.289538 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.289553 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:50Z","lastTransitionTime":"2025-12-11T13:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.392148 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.392192 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.392202 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.392218 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.392229 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:50Z","lastTransitionTime":"2025-12-11T13:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.495025 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.495096 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.495119 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.495147 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.495167 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:50Z","lastTransitionTime":"2025-12-11T13:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.599337 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.599421 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.599446 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.599513 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.599537 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:50Z","lastTransitionTime":"2025-12-11T13:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.701932 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.702012 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.702033 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.702058 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.702075 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:50Z","lastTransitionTime":"2025-12-11T13:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.804700 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.804781 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.804794 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.804810 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.804847 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:50Z","lastTransitionTime":"2025-12-11T13:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.906663 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.906873 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.906882 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.906894 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:50 crc kubenswrapper[4898]: I1211 13:05:50.906903 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:50Z","lastTransitionTime":"2025-12-11T13:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.009669 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.009698 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.009706 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.009719 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.009728 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:51Z","lastTransitionTime":"2025-12-11T13:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.111965 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.112010 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.112022 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.112038 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.112049 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:51Z","lastTransitionTime":"2025-12-11T13:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.214064 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.214114 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.214122 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.214136 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.214147 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:51Z","lastTransitionTime":"2025-12-11T13:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.316862 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.316906 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.316914 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.316928 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.316936 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:51Z","lastTransitionTime":"2025-12-11T13:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.419738 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.419803 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.419823 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.419860 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.419891 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:51Z","lastTransitionTime":"2025-12-11T13:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.522726 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.522772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.522787 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.522806 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.522820 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:51Z","lastTransitionTime":"2025-12-11T13:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.625791 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.625879 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.625894 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.625919 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.625939 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:51Z","lastTransitionTime":"2025-12-11T13:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.728933 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.729205 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.729288 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.729352 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.729411 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:51Z","lastTransitionTime":"2025-12-11T13:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.774098 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:51 crc kubenswrapper[4898]: E1211 13:05:51.774254 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.774517 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:51 crc kubenswrapper[4898]: E1211 13:05:51.774725 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.774892 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:51 crc kubenswrapper[4898]: E1211 13:05:51.775020 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.775043 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:51 crc kubenswrapper[4898]: E1211 13:05:51.775227 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.833657 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.833718 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.833732 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.833752 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.833925 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:51Z","lastTransitionTime":"2025-12-11T13:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.937274 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.937343 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.937365 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.937394 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:51 crc kubenswrapper[4898]: I1211 13:05:51.937416 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:51Z","lastTransitionTime":"2025-12-11T13:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.039973 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.040407 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.040836 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.041175 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.041526 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:52Z","lastTransitionTime":"2025-12-11T13:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.144375 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.144634 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.144703 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.144767 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.144825 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:52Z","lastTransitionTime":"2025-12-11T13:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.247289 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.247673 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.247773 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.247880 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.247957 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:52Z","lastTransitionTime":"2025-12-11T13:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.351065 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.351115 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.351131 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.351153 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.351165 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:52Z","lastTransitionTime":"2025-12-11T13:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.454148 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.454199 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.454210 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.454229 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.454242 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:52Z","lastTransitionTime":"2025-12-11T13:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.557682 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.557745 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.557757 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.557777 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.557792 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:52Z","lastTransitionTime":"2025-12-11T13:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.660631 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.661056 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.661222 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.663034 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.663270 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:52Z","lastTransitionTime":"2025-12-11T13:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.766030 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.766089 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.766102 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.766120 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.766133 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:52Z","lastTransitionTime":"2025-12-11T13:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.853598 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jmvlr" podStartSLOduration=91.853562817 podStartE2EDuration="1m31.853562817s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:05:52.853398082 +0000 UTC m=+110.425724529" watchObservedRunningTime="2025-12-11 13:05:52.853562817 +0000 UTC m=+110.425889294" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.853976 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dlqfj" podStartSLOduration=91.853961889 podStartE2EDuration="1m31.853961889s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:05:52.841169514 +0000 UTC m=+110.413495961" watchObservedRunningTime="2025-12-11 13:05:52.853961889 +0000 UTC m=+110.426288376" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.869516 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.869569 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.869581 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.869596 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.869606 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:52Z","lastTransitionTime":"2025-12-11T13:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.921155 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-brphv" podStartSLOduration=90.921123965 podStartE2EDuration="1m30.921123965s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:05:52.900600984 +0000 UTC m=+110.472927431" watchObservedRunningTime="2025-12-11 13:05:52.921123965 +0000 UTC m=+110.493450442" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.922178 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=57.922161585 podStartE2EDuration="57.922161585s" podCreationTimestamp="2025-12-11 13:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:05:52.920684302 +0000 UTC m=+110.493010749" watchObservedRunningTime="2025-12-11 13:05:52.922161585 +0000 UTC m=+110.494488062" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.933415 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=22.933387564 podStartE2EDuration="22.933387564s" podCreationTimestamp="2025-12-11 13:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-11 13:05:52.932959031 +0000 UTC m=+110.505285478" watchObservedRunningTime="2025-12-11 13:05:52.933387564 +0000 UTC m=+110.505714041" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.970816 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=90.970794379 podStartE2EDuration="1m30.970794379s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:05:52.959293872 +0000 UTC m=+110.531620319" watchObservedRunningTime="2025-12-11 13:05:52.970794379 +0000 UTC m=+110.543120826" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.971990 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.972061 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.972071 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.972087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.972097 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:52Z","lastTransitionTime":"2025-12-11T13:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.989393 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-h86cf" podStartSLOduration=91.989359382 podStartE2EDuration="1m31.989359382s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:05:52.973718534 +0000 UTC m=+110.546045031" watchObservedRunningTime="2025-12-11 13:05:52.989359382 +0000 UTC m=+110.561685869" Dec 11 13:05:52 crc kubenswrapper[4898]: I1211 13:05:52.990882 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podStartSLOduration=91.990862386 podStartE2EDuration="1m31.990862386s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:05:52.989268039 +0000 UTC m=+110.561594496" watchObservedRunningTime="2025-12-11 13:05:52.990862386 +0000 UTC m=+110.563188883" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.006117 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=92.006101382 podStartE2EDuration="1m32.006101382s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:05:53.004705891 +0000 UTC m=+110.577032388" watchObservedRunningTime="2025-12-11 13:05:53.006101382 +0000 UTC m=+110.578427819" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.050564 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7lxfm" 
podStartSLOduration=92.050533623 podStartE2EDuration="1m32.050533623s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:05:53.049708509 +0000 UTC m=+110.622034966" watchObservedRunningTime="2025-12-11 13:05:53.050533623 +0000 UTC m=+110.622860140" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.074771 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.074809 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.074819 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.074833 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.074843 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:53Z","lastTransitionTime":"2025-12-11T13:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.077090 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=92.07707239 podStartE2EDuration="1m32.07707239s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:05:53.076722149 +0000 UTC m=+110.649048586" watchObservedRunningTime="2025-12-11 13:05:53.07707239 +0000 UTC m=+110.649398867" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.177628 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.177686 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.177704 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.177726 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.177743 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:53Z","lastTransitionTime":"2025-12-11T13:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.280636 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.280710 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.280740 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.280770 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.280792 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:53Z","lastTransitionTime":"2025-12-11T13:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.383073 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.383338 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.383521 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.383628 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.383716 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:53Z","lastTransitionTime":"2025-12-11T13:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.488615 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.488674 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.488697 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.488725 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.488748 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:53Z","lastTransitionTime":"2025-12-11T13:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.590715 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.590747 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.590755 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.590770 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.590778 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:53Z","lastTransitionTime":"2025-12-11T13:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.694204 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.694243 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.694257 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.694279 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.694292 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:53Z","lastTransitionTime":"2025-12-11T13:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.774366 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.774396 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.774400 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.774510 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:53 crc kubenswrapper[4898]: E1211 13:05:53.774536 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:53 crc kubenswrapper[4898]: E1211 13:05:53.774599 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:53 crc kubenswrapper[4898]: E1211 13:05:53.774801 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:53 crc kubenswrapper[4898]: E1211 13:05:53.774873 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.797695 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.797765 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.797787 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.797813 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.797831 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:53Z","lastTransitionTime":"2025-12-11T13:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.901156 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.901220 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.901236 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.901259 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:53 crc kubenswrapper[4898]: I1211 13:05:53.901279 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:53Z","lastTransitionTime":"2025-12-11T13:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.004413 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.004503 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.004520 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.004544 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.004565 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:54Z","lastTransitionTime":"2025-12-11T13:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.107320 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.107385 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.107403 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.107429 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.107448 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:54Z","lastTransitionTime":"2025-12-11T13:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.210216 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.210279 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.210297 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.210321 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.210338 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:54Z","lastTransitionTime":"2025-12-11T13:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.312919 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.312960 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.312973 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.312990 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.313001 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:54Z","lastTransitionTime":"2025-12-11T13:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.415773 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.415832 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.415848 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.415870 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.415887 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:54Z","lastTransitionTime":"2025-12-11T13:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.518631 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.518674 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.518690 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.518709 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.518724 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:54Z","lastTransitionTime":"2025-12-11T13:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.621995 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.622041 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.622059 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.622081 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.622101 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:54Z","lastTransitionTime":"2025-12-11T13:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.724914 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.724952 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.724960 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.724974 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.724982 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:54Z","lastTransitionTime":"2025-12-11T13:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.827942 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.828005 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.828022 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.828043 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.828058 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:54Z","lastTransitionTime":"2025-12-11T13:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.929945 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.929992 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.930001 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.930015 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:54 crc kubenswrapper[4898]: I1211 13:05:54.930024 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:54Z","lastTransitionTime":"2025-12-11T13:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.032380 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.032486 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.032505 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.032528 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.032561 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:55Z","lastTransitionTime":"2025-12-11T13:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.135727 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.135784 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.135800 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.135821 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.135837 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:55Z","lastTransitionTime":"2025-12-11T13:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.238300 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.238359 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.238376 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.238403 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.238421 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:55Z","lastTransitionTime":"2025-12-11T13:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.341953 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.342019 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.342034 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.342050 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.342064 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:55Z","lastTransitionTime":"2025-12-11T13:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.445345 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.445408 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.445429 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.445488 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.445513 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:55Z","lastTransitionTime":"2025-12-11T13:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.548626 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.548681 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.548696 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.548717 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.548732 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:55Z","lastTransitionTime":"2025-12-11T13:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.652600 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.652662 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.652729 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.652759 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.652782 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:55Z","lastTransitionTime":"2025-12-11T13:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.755356 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.755410 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.755425 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.755494 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.755513 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:55Z","lastTransitionTime":"2025-12-11T13:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.774736 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.774776 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:55 crc kubenswrapper[4898]: E1211 13:05:55.774866 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.774992 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:55 crc kubenswrapper[4898]: E1211 13:05:55.774995 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:55 crc kubenswrapper[4898]: E1211 13:05:55.775182 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.775285 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:55 crc kubenswrapper[4898]: E1211 13:05:55.775436 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.871551 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.871602 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.871612 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.871629 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.871641 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:55Z","lastTransitionTime":"2025-12-11T13:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.974406 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.974479 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.974491 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.974509 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:55 crc kubenswrapper[4898]: I1211 13:05:55.974521 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:55Z","lastTransitionTime":"2025-12-11T13:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.076747 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.077012 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.077102 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.077173 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.077234 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:56Z","lastTransitionTime":"2025-12-11T13:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.180089 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.180127 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.180137 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.180152 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.180163 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:56Z","lastTransitionTime":"2025-12-11T13:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.283202 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.283236 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.283246 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.283261 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.283273 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:56Z","lastTransitionTime":"2025-12-11T13:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.385826 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.385877 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.385889 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.385906 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.385920 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:56Z","lastTransitionTime":"2025-12-11T13:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.406202 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlqfj_4e8ed6cb-b822-4b64-9e00-e755c5aea812/kube-multus/1.log" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.406695 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlqfj_4e8ed6cb-b822-4b64-9e00-e755c5aea812/kube-multus/0.log" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.406739 4898 generic.go:334] "Generic (PLEG): container finished" podID="4e8ed6cb-b822-4b64-9e00-e755c5aea812" containerID="dbc9b844a873836a500af9d781e17c635495762b0e50af71dadf141ff00723a3" exitCode=1 Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.406768 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dlqfj" event={"ID":"4e8ed6cb-b822-4b64-9e00-e755c5aea812","Type":"ContainerDied","Data":"dbc9b844a873836a500af9d781e17c635495762b0e50af71dadf141ff00723a3"} Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.406801 4898 scope.go:117] "RemoveContainer" containerID="76ddfb054d2dd7ec27a771c96a0fca2cead8eacc3221c22b36c1d075414a1995" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.407212 4898 scope.go:117] "RemoveContainer" containerID="dbc9b844a873836a500af9d781e17c635495762b0e50af71dadf141ff00723a3" Dec 11 13:05:56 crc kubenswrapper[4898]: E1211 13:05:56.407400 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-dlqfj_openshift-multus(4e8ed6cb-b822-4b64-9e00-e755c5aea812)\"" pod="openshift-multus/multus-dlqfj" podUID="4e8ed6cb-b822-4b64-9e00-e755c5aea812" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.488737 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:56 crc 
kubenswrapper[4898]: I1211 13:05:56.488963 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.489066 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.489132 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.489193 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:56Z","lastTransitionTime":"2025-12-11T13:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.591858 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.591889 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.591896 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.591909 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.591921 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:56Z","lastTransitionTime":"2025-12-11T13:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.694671 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.694734 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.694752 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.694776 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.694795 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:56Z","lastTransitionTime":"2025-12-11T13:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.797180 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.797215 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.797226 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.797242 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.797253 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:56Z","lastTransitionTime":"2025-12-11T13:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.899934 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.900000 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.900022 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.900051 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:56 crc kubenswrapper[4898]: I1211 13:05:56.900072 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:56Z","lastTransitionTime":"2025-12-11T13:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.002316 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.002388 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.002412 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.002441 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.002514 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:57Z","lastTransitionTime":"2025-12-11T13:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.104842 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.104924 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.104945 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.104975 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.104998 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:57Z","lastTransitionTime":"2025-12-11T13:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.208384 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.208439 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.208452 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.208495 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.208511 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:57Z","lastTransitionTime":"2025-12-11T13:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.311693 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.311735 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.311746 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.311763 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.311774 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:57Z","lastTransitionTime":"2025-12-11T13:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.412296 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlqfj_4e8ed6cb-b822-4b64-9e00-e755c5aea812/kube-multus/1.log" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.413485 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.413533 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.413543 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.413560 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.413598 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:57Z","lastTransitionTime":"2025-12-11T13:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.516599 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.516670 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.516692 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.516718 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.516739 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:57Z","lastTransitionTime":"2025-12-11T13:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.619405 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.619558 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.619579 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.619602 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.619619 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:57Z","lastTransitionTime":"2025-12-11T13:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.722938 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.723006 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.723021 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.723042 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.723059 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:57Z","lastTransitionTime":"2025-12-11T13:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.774171 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.774309 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:57 crc kubenswrapper[4898]: E1211 13:05:57.774430 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.774522 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:57 crc kubenswrapper[4898]: E1211 13:05:57.774590 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.774534 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:57 crc kubenswrapper[4898]: E1211 13:05:57.774688 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:57 crc kubenswrapper[4898]: E1211 13:05:57.774834 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.826415 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.826522 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.826546 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.826576 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.826598 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:57Z","lastTransitionTime":"2025-12-11T13:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.929563 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.929809 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.929826 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.929855 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:57 crc kubenswrapper[4898]: I1211 13:05:57.929872 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:57Z","lastTransitionTime":"2025-12-11T13:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.032794 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.032855 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.032871 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.032896 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.032914 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:58Z","lastTransitionTime":"2025-12-11T13:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.135436 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.135529 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.135549 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.135576 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.135604 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:58Z","lastTransitionTime":"2025-12-11T13:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.238236 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.238319 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.238331 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.238349 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.238361 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:58Z","lastTransitionTime":"2025-12-11T13:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.341343 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.341399 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.341419 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.341441 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.341490 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:58Z","lastTransitionTime":"2025-12-11T13:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.444105 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.444171 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.444189 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.444215 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.444232 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:58Z","lastTransitionTime":"2025-12-11T13:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.456863 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.456929 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.456946 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.456970 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.456987 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T13:05:58Z","lastTransitionTime":"2025-12-11T13:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.522385 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz"] Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.522998 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.525229 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.525606 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.525439 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.526295 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.639651 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0a244f02-814f-496a-82e1-8afda88ff8e9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-x5ckz\" (UID: \"0a244f02-814f-496a-82e1-8afda88ff8e9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.640371 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a244f02-814f-496a-82e1-8afda88ff8e9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-x5ckz\" (UID: \"0a244f02-814f-496a-82e1-8afda88ff8e9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.640554 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/0a244f02-814f-496a-82e1-8afda88ff8e9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-x5ckz\" (UID: \"0a244f02-814f-496a-82e1-8afda88ff8e9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.640682 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a244f02-814f-496a-82e1-8afda88ff8e9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-x5ckz\" (UID: \"0a244f02-814f-496a-82e1-8afda88ff8e9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.640796 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a244f02-814f-496a-82e1-8afda88ff8e9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-x5ckz\" (UID: \"0a244f02-814f-496a-82e1-8afda88ff8e9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.741350 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0a244f02-814f-496a-82e1-8afda88ff8e9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-x5ckz\" (UID: \"0a244f02-814f-496a-82e1-8afda88ff8e9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.741661 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a244f02-814f-496a-82e1-8afda88ff8e9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-x5ckz\" (UID: \"0a244f02-814f-496a-82e1-8afda88ff8e9\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.741474 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0a244f02-814f-496a-82e1-8afda88ff8e9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-x5ckz\" (UID: \"0a244f02-814f-496a-82e1-8afda88ff8e9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.741768 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a244f02-814f-496a-82e1-8afda88ff8e9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-x5ckz\" (UID: \"0a244f02-814f-496a-82e1-8afda88ff8e9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.741951 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0a244f02-814f-496a-82e1-8afda88ff8e9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-x5ckz\" (UID: \"0a244f02-814f-496a-82e1-8afda88ff8e9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.742002 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a244f02-814f-496a-82e1-8afda88ff8e9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-x5ckz\" (UID: \"0a244f02-814f-496a-82e1-8afda88ff8e9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.742123 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/0a244f02-814f-496a-82e1-8afda88ff8e9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-x5ckz\" (UID: \"0a244f02-814f-496a-82e1-8afda88ff8e9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.744140 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a244f02-814f-496a-82e1-8afda88ff8e9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-x5ckz\" (UID: \"0a244f02-814f-496a-82e1-8afda88ff8e9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.749653 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a244f02-814f-496a-82e1-8afda88ff8e9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-x5ckz\" (UID: \"0a244f02-814f-496a-82e1-8afda88ff8e9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.759347 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a244f02-814f-496a-82e1-8afda88ff8e9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-x5ckz\" (UID: \"0a244f02-814f-496a-82e1-8afda88ff8e9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:58 crc kubenswrapper[4898]: I1211 13:05:58.855800 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" Dec 11 13:05:59 crc kubenswrapper[4898]: I1211 13:05:59.421279 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" event={"ID":"0a244f02-814f-496a-82e1-8afda88ff8e9","Type":"ContainerStarted","Data":"7034a5a056e42565eddd46d9d72fdb85471d132dba64fbbce9fe618a5de628d7"} Dec 11 13:05:59 crc kubenswrapper[4898]: I1211 13:05:59.774131 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:05:59 crc kubenswrapper[4898]: I1211 13:05:59.774131 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:05:59 crc kubenswrapper[4898]: I1211 13:05:59.774151 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:05:59 crc kubenswrapper[4898]: I1211 13:05:59.774291 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:05:59 crc kubenswrapper[4898]: E1211 13:05:59.774419 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:05:59 crc kubenswrapper[4898]: E1211 13:05:59.774604 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:05:59 crc kubenswrapper[4898]: E1211 13:05:59.774735 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:05:59 crc kubenswrapper[4898]: E1211 13:05:59.774873 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:06:00 crc kubenswrapper[4898]: I1211 13:06:00.426087 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" event={"ID":"0a244f02-814f-496a-82e1-8afda88ff8e9","Type":"ContainerStarted","Data":"a805dcbacba826378ad27b226d4ac7feecd743ed00cd84e039fa8f8810f4c659"} Dec 11 13:06:00 crc kubenswrapper[4898]: I1211 13:06:00.441139 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x5ckz" podStartSLOduration=99.44111599 podStartE2EDuration="1m39.44111599s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:00.440398309 +0000 UTC m=+118.012724756" watchObservedRunningTime="2025-12-11 13:06:00.44111599 +0000 UTC m=+118.013442467" Dec 11 13:06:00 crc kubenswrapper[4898]: I1211 13:06:00.775705 4898 scope.go:117] "RemoveContainer" containerID="b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde" Dec 11 13:06:00 crc kubenswrapper[4898]: E1211 13:06:00.775976 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qndxl_openshift-ovn-kubernetes(1efa7034-8a95-4e6e-bd84-0189dc5acaa3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" Dec 11 13:06:01 crc kubenswrapper[4898]: I1211 13:06:01.774598 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:06:01 crc kubenswrapper[4898]: I1211 13:06:01.774713 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:06:01 crc kubenswrapper[4898]: I1211 13:06:01.774616 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:01 crc kubenswrapper[4898]: I1211 13:06:01.774614 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:01 crc kubenswrapper[4898]: E1211 13:06:01.774859 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:06:01 crc kubenswrapper[4898]: E1211 13:06:01.774992 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:06:01 crc kubenswrapper[4898]: E1211 13:06:01.775084 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:06:01 crc kubenswrapper[4898]: E1211 13:06:01.775151 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:06:02 crc kubenswrapper[4898]: E1211 13:06:02.766302 4898 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 11 13:06:02 crc kubenswrapper[4898]: E1211 13:06:02.897888 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:06:03 crc kubenswrapper[4898]: I1211 13:06:03.774192 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:06:03 crc kubenswrapper[4898]: E1211 13:06:03.774508 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:06:03 crc kubenswrapper[4898]: I1211 13:06:03.774590 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:03 crc kubenswrapper[4898]: E1211 13:06:03.774653 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:06:03 crc kubenswrapper[4898]: I1211 13:06:03.774664 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:03 crc kubenswrapper[4898]: I1211 13:06:03.774720 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:06:03 crc kubenswrapper[4898]: E1211 13:06:03.774826 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:06:03 crc kubenswrapper[4898]: E1211 13:06:03.775109 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:06:05 crc kubenswrapper[4898]: I1211 13:06:05.774799 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:05 crc kubenswrapper[4898]: I1211 13:06:05.774853 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:06:05 crc kubenswrapper[4898]: I1211 13:06:05.774825 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:06:05 crc kubenswrapper[4898]: I1211 13:06:05.774799 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:05 crc kubenswrapper[4898]: E1211 13:06:05.775024 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:06:05 crc kubenswrapper[4898]: E1211 13:06:05.775168 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:06:05 crc kubenswrapper[4898]: E1211 13:06:05.775308 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:06:05 crc kubenswrapper[4898]: E1211 13:06:05.775398 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:06:07 crc kubenswrapper[4898]: I1211 13:06:07.774891 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:06:07 crc kubenswrapper[4898]: I1211 13:06:07.775005 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:07 crc kubenswrapper[4898]: E1211 13:06:07.775038 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:06:07 crc kubenswrapper[4898]: I1211 13:06:07.775323 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:06:07 crc kubenswrapper[4898]: E1211 13:06:07.775400 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:06:07 crc kubenswrapper[4898]: E1211 13:06:07.775569 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:06:07 crc kubenswrapper[4898]: I1211 13:06:07.774871 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:07 crc kubenswrapper[4898]: E1211 13:06:07.775836 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:06:07 crc kubenswrapper[4898]: E1211 13:06:07.898858 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:06:09 crc kubenswrapper[4898]: I1211 13:06:09.774868 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:09 crc kubenswrapper[4898]: I1211 13:06:09.774931 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:06:09 crc kubenswrapper[4898]: I1211 13:06:09.774903 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:06:09 crc kubenswrapper[4898]: I1211 13:06:09.774892 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:09 crc kubenswrapper[4898]: E1211 13:06:09.775091 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:06:09 crc kubenswrapper[4898]: E1211 13:06:09.775208 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:06:09 crc kubenswrapper[4898]: E1211 13:06:09.775260 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:06:09 crc kubenswrapper[4898]: E1211 13:06:09.775340 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:06:10 crc kubenswrapper[4898]: I1211 13:06:10.775361 4898 scope.go:117] "RemoveContainer" containerID="dbc9b844a873836a500af9d781e17c635495762b0e50af71dadf141ff00723a3" Dec 11 13:06:11 crc kubenswrapper[4898]: I1211 13:06:11.467558 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlqfj_4e8ed6cb-b822-4b64-9e00-e755c5aea812/kube-multus/1.log" Dec 11 13:06:11 crc kubenswrapper[4898]: I1211 13:06:11.467647 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dlqfj" event={"ID":"4e8ed6cb-b822-4b64-9e00-e755c5aea812","Type":"ContainerStarted","Data":"114b3259a9fd034573fe9dc3c103980a05be8bdb8205c084f35e27725255ec28"} Dec 11 13:06:11 crc kubenswrapper[4898]: I1211 13:06:11.773962 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:11 crc kubenswrapper[4898]: I1211 13:06:11.774062 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:06:11 crc kubenswrapper[4898]: E1211 13:06:11.774152 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:06:11 crc kubenswrapper[4898]: I1211 13:06:11.774274 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:06:11 crc kubenswrapper[4898]: I1211 13:06:11.773989 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:11 crc kubenswrapper[4898]: E1211 13:06:11.774545 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:06:11 crc kubenswrapper[4898]: E1211 13:06:11.774836 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:06:11 crc kubenswrapper[4898]: E1211 13:06:11.774915 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:06:12 crc kubenswrapper[4898]: E1211 13:06:12.899834 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Dec 11 13:06:13 crc kubenswrapper[4898]: I1211 13:06:13.774094 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:06:13 crc kubenswrapper[4898]: E1211 13:06:13.774703 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:06:13 crc kubenswrapper[4898]: I1211 13:06:13.774175 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:13 crc kubenswrapper[4898]: E1211 13:06:13.774979 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:06:13 crc kubenswrapper[4898]: I1211 13:06:13.774094 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:13 crc kubenswrapper[4898]: I1211 13:06:13.774197 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:06:13 crc kubenswrapper[4898]: E1211 13:06:13.775128 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:06:13 crc kubenswrapper[4898]: E1211 13:06:13.775223 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:06:15 crc kubenswrapper[4898]: I1211 13:06:15.774914 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:06:15 crc kubenswrapper[4898]: E1211 13:06:15.775125 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:06:15 crc kubenswrapper[4898]: I1211 13:06:15.775379 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:15 crc kubenswrapper[4898]: I1211 13:06:15.775417 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:06:15 crc kubenswrapper[4898]: I1211 13:06:15.775948 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:15 crc kubenswrapper[4898]: I1211 13:06:15.776443 4898 scope.go:117] "RemoveContainer" containerID="b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde" Dec 11 13:06:15 crc kubenswrapper[4898]: E1211 13:06:15.776628 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:06:15 crc kubenswrapper[4898]: E1211 13:06:15.776931 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:06:15 crc kubenswrapper[4898]: E1211 13:06:15.777194 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:06:16 crc kubenswrapper[4898]: I1211 13:06:16.487494 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovnkube-controller/3.log" Dec 11 13:06:16 crc kubenswrapper[4898]: I1211 13:06:16.490647 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerStarted","Data":"1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d"} Dec 11 13:06:16 crc kubenswrapper[4898]: I1211 13:06:16.491583 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:06:16 crc kubenswrapper[4898]: I1211 13:06:16.509014 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zcq7l"] Dec 11 13:06:16 crc kubenswrapper[4898]: I1211 13:06:16.509136 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:16 crc kubenswrapper[4898]: E1211 13:06:16.509224 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:06:17 crc kubenswrapper[4898]: I1211 13:06:17.773964 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:06:17 crc kubenswrapper[4898]: I1211 13:06:17.773989 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:06:17 crc kubenswrapper[4898]: I1211 13:06:17.774070 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:17 crc kubenswrapper[4898]: E1211 13:06:17.774102 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:06:17 crc kubenswrapper[4898]: E1211 13:06:17.774194 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:06:17 crc kubenswrapper[4898]: E1211 13:06:17.774306 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:06:17 crc kubenswrapper[4898]: E1211 13:06:17.901611 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:06:18 crc kubenswrapper[4898]: I1211 13:06:18.774979 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:18 crc kubenswrapper[4898]: E1211 13:06:18.775119 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:06:19 crc kubenswrapper[4898]: I1211 13:06:19.774233 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:06:19 crc kubenswrapper[4898]: I1211 13:06:19.774233 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:06:19 crc kubenswrapper[4898]: I1211 13:06:19.774257 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:19 crc kubenswrapper[4898]: E1211 13:06:19.774575 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:06:19 crc kubenswrapper[4898]: E1211 13:06:19.774374 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:06:19 crc kubenswrapper[4898]: E1211 13:06:19.774644 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:06:20 crc kubenswrapper[4898]: I1211 13:06:20.774913 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:20 crc kubenswrapper[4898]: E1211 13:06:20.775125 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:06:21 crc kubenswrapper[4898]: I1211 13:06:21.774060 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:06:21 crc kubenswrapper[4898]: I1211 13:06:21.774145 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:06:21 crc kubenswrapper[4898]: I1211 13:06:21.774143 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:21 crc kubenswrapper[4898]: E1211 13:06:21.774230 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 13:06:21 crc kubenswrapper[4898]: E1211 13:06:21.774312 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 13:06:21 crc kubenswrapper[4898]: E1211 13:06:21.774529 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 13:06:22 crc kubenswrapper[4898]: I1211 13:06:22.774394 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:22 crc kubenswrapper[4898]: E1211 13:06:22.775360 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zcq7l" podUID="34380c7c-1d75-4f6f-a6cb-b015a55ca978" Dec 11 13:06:23 crc kubenswrapper[4898]: I1211 13:06:23.774208 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:23 crc kubenswrapper[4898]: I1211 13:06:23.774297 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:06:23 crc kubenswrapper[4898]: I1211 13:06:23.774252 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:06:23 crc kubenswrapper[4898]: I1211 13:06:23.776325 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 11 13:06:23 crc kubenswrapper[4898]: I1211 13:06:23.778197 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 11 13:06:23 crc kubenswrapper[4898]: I1211 13:06:23.778258 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 11 13:06:23 crc kubenswrapper[4898]: I1211 13:06:23.778361 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 11 13:06:24 crc kubenswrapper[4898]: I1211 13:06:24.774959 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:24 crc kubenswrapper[4898]: I1211 13:06:24.777738 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 11 13:06:24 crc kubenswrapper[4898]: I1211 13:06:24.779173 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.894347 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.956994 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podStartSLOduration=127.95697636 podStartE2EDuration="2m7.95697636s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 
13:06:16.537984763 +0000 UTC m=+134.110311210" watchObservedRunningTime="2025-12-11 13:06:28.95697636 +0000 UTC m=+146.529302807" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.957585 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d"] Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.958587 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.958785 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pjp8z"] Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.959936 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.960616 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz"] Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.961433 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.962235 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fvkz4"] Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.963346 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.963769 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lqr65"] Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.964677 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.965077 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2"] Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.969827 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.970368 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.970885 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.971203 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.971629 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.971699 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.971769 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.971776 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.971804 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 11 13:06:28 crc 
kubenswrapper[4898]: I1211 13:06:28.971896 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.971923 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.972009 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.972097 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.972365 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.973667 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.973743 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.978072 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.978124 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lpsbl"] Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.978290 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.978432 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.978485 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.978506 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.978528 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.978652 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.978725 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.978796 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.978835 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.978845 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.978918 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.979006 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.979015 
4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.979055 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.979148 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.979296 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.979819 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.980247 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.982518 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd"] Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.983156 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.984336 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.984336 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.984787 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.985231 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.985444 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.985794 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.986402 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92578329-85d1-4295-bc06-88764e9d54c2-config\") pod \"route-controller-manager-6576b87f9c-4wrc2\" (UID: \"92578329-85d1-4295-bc06-88764e9d54c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.986531 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-serving-cert\") pod \"controller-manager-879f6c89f-lpsbl\" (UID: 
\"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.986590 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/494fa5aa-0063-4f27-af65-f3a92879017a-node-pullsecrets\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.986669 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lpsbl\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.986718 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/494fa5aa-0063-4f27-af65-f3a92879017a-image-import-ca\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.986764 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/494fa5aa-0063-4f27-af65-f3a92879017a-serving-cert\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.986854 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/494fa5aa-0063-4f27-af65-f3a92879017a-audit\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.986909 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c39dd70-f596-4e60-b400-3611eddfefc4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2c4kd\" (UID: \"5c39dd70-f596-4e60-b400-3611eddfefc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.986955 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/494fa5aa-0063-4f27-af65-f3a92879017a-config\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.987007 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljmkm\" (UniqueName: \"kubernetes.io/projected/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-kube-api-access-ljmkm\") pod \"controller-manager-879f6c89f-lpsbl\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.987056 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/494fa5aa-0063-4f27-af65-f3a92879017a-etcd-client\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.987111 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c39dd70-f596-4e60-b400-3611eddfefc4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2c4kd\" (UID: \"5c39dd70-f596-4e60-b400-3611eddfefc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.987159 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/494fa5aa-0063-4f27-af65-f3a92879017a-audit-dir\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.987210 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92578329-85d1-4295-bc06-88764e9d54c2-client-ca\") pod \"route-controller-manager-6576b87f9c-4wrc2\" (UID: \"92578329-85d1-4295-bc06-88764e9d54c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.987255 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-config\") pod \"controller-manager-879f6c89f-lpsbl\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.987343 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-client-ca\") pod \"controller-manager-879f6c89f-lpsbl\" (UID: 
\"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.987393 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/494fa5aa-0063-4f27-af65-f3a92879017a-etcd-serving-ca\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.987510 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vrf2\" (UniqueName: \"kubernetes.io/projected/a4915d3f-e64d-4fad-8933-218f11bc783d-kube-api-access-4vrf2\") pod \"machine-approver-56656f9798-c9zbz\" (UID: \"a4915d3f-e64d-4fad-8933-218f11bc783d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.987569 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76hpf\" (UniqueName: \"kubernetes.io/projected/5c39dd70-f596-4e60-b400-3611eddfefc4-kube-api-access-76hpf\") pod \"openshift-apiserver-operator-796bbdcf4f-2c4kd\" (UID: \"5c39dd70-f596-4e60-b400-3611eddfefc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.987799 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4915d3f-e64d-4fad-8933-218f11bc783d-config\") pod \"machine-approver-56656f9798-c9zbz\" (UID: \"a4915d3f-e64d-4fad-8933-218f11bc783d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.987880 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/494fa5aa-0063-4f27-af65-f3a92879017a-encryption-config\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.987943 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a4915d3f-e64d-4fad-8933-218f11bc783d-auth-proxy-config\") pod \"machine-approver-56656f9798-c9zbz\" (UID: \"a4915d3f-e64d-4fad-8933-218f11bc783d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.988030 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a4915d3f-e64d-4fad-8933-218f11bc783d-machine-approver-tls\") pod \"machine-approver-56656f9798-c9zbz\" (UID: \"a4915d3f-e64d-4fad-8933-218f11bc783d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.988099 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92578329-85d1-4295-bc06-88764e9d54c2-serving-cert\") pod \"route-controller-manager-6576b87f9c-4wrc2\" (UID: \"92578329-85d1-4295-bc06-88764e9d54c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.988168 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db2rp\" (UniqueName: \"kubernetes.io/projected/92578329-85d1-4295-bc06-88764e9d54c2-kube-api-access-db2rp\") pod 
\"route-controller-manager-6576b87f9c-4wrc2\" (UID: \"92578329-85d1-4295-bc06-88764e9d54c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.988213 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/494fa5aa-0063-4f27-af65-f3a92879017a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.988244 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxtlp\" (UniqueName: \"kubernetes.io/projected/494fa5aa-0063-4f27-af65-f3a92879017a-kube-api-access-xxtlp\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.989661 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lbjzz"] Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.990288 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.990392 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rkglj"] Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.991075 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rkglj" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.992820 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 11 13:06:28 crc kubenswrapper[4898]: I1211 13:06:28.993353 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7csms"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.010857 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.011158 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.011730 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.027065 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.034792 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.035022 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.035773 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.036409 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 11 
13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.036563 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.036665 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.036772 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.037256 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.037308 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-4bmq4"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.037784 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-7cbpd"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.038050 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-4bmq4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.038179 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.038100 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.038937 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.047084 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.047195 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.047302 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.047805 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.047962 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.048055 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.048141 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.048224 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.048699 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.048699 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.048948 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.048895 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.049131 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.049160 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.049228 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.049267 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.049351 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.049416 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.049952 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g9vz7"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.050210 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.050350 4898 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.050363 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.051749 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-42442"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.052144 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.054037 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.056306 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.056592 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.056799 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.057200 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.058755 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.058861 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27"] Dec 11 
13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.059256 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.059425 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.059556 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.061094 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mmcc8"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.061582 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-jmljz"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.061982 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.062030 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.062292 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.062577 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mmcc8" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.063451 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.063863 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.064512 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.064951 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.065083 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.079441 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.083688 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.083798 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.084401 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.084490 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.085256 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.085726 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.085738 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.086019 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.087360 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.087838 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.108265 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af9f94b9-3e85-4e0c-bb97-ac84071e969f-config\") pod \"authentication-operator-69f744f599-pjp8z\" (UID: \"af9f94b9-3e85-4e0c-bb97-ac84071e969f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.108347 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vrf2\" (UniqueName: \"kubernetes.io/projected/a4915d3f-e64d-4fad-8933-218f11bc783d-kube-api-access-4vrf2\") pod \"machine-approver-56656f9798-c9zbz\" (UID: \"a4915d3f-e64d-4fad-8933-218f11bc783d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.108409 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76hpf\" (UniqueName: \"kubernetes.io/projected/5c39dd70-f596-4e60-b400-3611eddfefc4-kube-api-access-76hpf\") pod \"openshift-apiserver-operator-796bbdcf4f-2c4kd\" (UID: \"5c39dd70-f596-4e60-b400-3611eddfefc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.108501 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4915d3f-e64d-4fad-8933-218f11bc783d-config\") pod \"machine-approver-56656f9798-c9zbz\" (UID: \"a4915d3f-e64d-4fad-8933-218f11bc783d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.108531 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/494fa5aa-0063-4f27-af65-f3a92879017a-encryption-config\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.108570 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a4915d3f-e64d-4fad-8933-218f11bc783d-auth-proxy-config\") pod \"machine-approver-56656f9798-c9zbz\" (UID: \"a4915d3f-e64d-4fad-8933-218f11bc783d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.108610 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-encryption-config\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.108649 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dd4c49a-1898-48ab-9ed0-4f455f392b57-config\") pod \"machine-api-operator-5694c8668f-lqr65\" (UID: \"1dd4c49a-1898-48ab-9ed0-4f455f392b57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.108682 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1dd4c49a-1898-48ab-9ed0-4f455f392b57-images\") pod \"machine-api-operator-5694c8668f-lqr65\" (UID: \"1dd4c49a-1898-48ab-9ed0-4f455f392b57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 
13:06:29.108718 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-audit-policies\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.108750 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-serving-cert\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.108788 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a4915d3f-e64d-4fad-8933-218f11bc783d-machine-approver-tls\") pod \"machine-approver-56656f9798-c9zbz\" (UID: \"a4915d3f-e64d-4fad-8933-218f11bc783d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.108829 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92578329-85d1-4295-bc06-88764e9d54c2-serving-cert\") pod \"route-controller-manager-6576b87f9c-4wrc2\" (UID: \"92578329-85d1-4295-bc06-88764e9d54c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.108882 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db2rp\" (UniqueName: \"kubernetes.io/projected/92578329-85d1-4295-bc06-88764e9d54c2-kube-api-access-db2rp\") pod \"route-controller-manager-6576b87f9c-4wrc2\" (UID: 
\"92578329-85d1-4295-bc06-88764e9d54c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.108922 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/494fa5aa-0063-4f27-af65-f3a92879017a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.108960 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxtlp\" (UniqueName: \"kubernetes.io/projected/494fa5aa-0063-4f27-af65-f3a92879017a-kube-api-access-xxtlp\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109007 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87f9w\" (UniqueName: \"kubernetes.io/projected/1dd4c49a-1898-48ab-9ed0-4f455f392b57-kube-api-access-87f9w\") pod \"machine-api-operator-5694c8668f-lqr65\" (UID: \"1dd4c49a-1898-48ab-9ed0-4f455f392b57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109049 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twzqk\" (UniqueName: \"kubernetes.io/projected/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-kube-api-access-twzqk\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109089 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/92578329-85d1-4295-bc06-88764e9d54c2-config\") pod \"route-controller-manager-6576b87f9c-4wrc2\" (UID: \"92578329-85d1-4295-bc06-88764e9d54c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109128 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-serving-cert\") pod \"controller-manager-879f6c89f-lpsbl\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109167 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/494fa5aa-0063-4f27-af65-f3a92879017a-node-pullsecrets\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109201 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1dd4c49a-1898-48ab-9ed0-4f455f392b57-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lqr65\" (UID: \"1dd4c49a-1898-48ab-9ed0-4f455f392b57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109251 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af9f94b9-3e85-4e0c-bb97-ac84071e969f-serving-cert\") pod \"authentication-operator-69f744f599-pjp8z\" (UID: \"af9f94b9-3e85-4e0c-bb97-ac84071e969f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" Dec 11 13:06:29 crc 
kubenswrapper[4898]: I1211 13:06:29.109289 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gjkg\" (UniqueName: \"kubernetes.io/projected/af9f94b9-3e85-4e0c-bb97-ac84071e969f-kube-api-access-9gjkg\") pod \"authentication-operator-69f744f599-pjp8z\" (UID: \"af9f94b9-3e85-4e0c-bb97-ac84071e969f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109317 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lpsbl\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109343 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109377 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-etcd-client\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109312 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109704 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/494fa5aa-0063-4f27-af65-f3a92879017a-image-import-ca\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109742 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/494fa5aa-0063-4f27-af65-f3a92879017a-serving-cert\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109772 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af9f94b9-3e85-4e0c-bb97-ac84071e969f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pjp8z\" (UID: \"af9f94b9-3e85-4e0c-bb97-ac84071e969f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109801 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/494fa5aa-0063-4f27-af65-f3a92879017a-audit\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109830 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af9f94b9-3e85-4e0c-bb97-ac84071e969f-service-ca-bundle\") pod \"authentication-operator-69f744f599-pjp8z\" (UID: \"af9f94b9-3e85-4e0c-bb97-ac84071e969f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" Dec 11 
13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109855 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c39dd70-f596-4e60-b400-3611eddfefc4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2c4kd\" (UID: \"5c39dd70-f596-4e60-b400-3611eddfefc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109878 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/494fa5aa-0063-4f27-af65-f3a92879017a-config\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109909 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljmkm\" (UniqueName: \"kubernetes.io/projected/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-kube-api-access-ljmkm\") pod \"controller-manager-879f6c89f-lpsbl\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109930 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/494fa5aa-0063-4f27-af65-f3a92879017a-etcd-client\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109953 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c39dd70-f596-4e60-b400-3611eddfefc4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2c4kd\" (UID: \"5c39dd70-f596-4e60-b400-3611eddfefc4\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109972 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/494fa5aa-0063-4f27-af65-f3a92879017a-audit-dir\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.109996 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92578329-85d1-4295-bc06-88764e9d54c2-client-ca\") pod \"route-controller-manager-6576b87f9c-4wrc2\" (UID: \"92578329-85d1-4295-bc06-88764e9d54c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.110016 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-config\") pod \"controller-manager-879f6c89f-lpsbl\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.110228 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.110259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-client-ca\") pod 
\"controller-manager-879f6c89f-lpsbl\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.110282 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/494fa5aa-0063-4f27-af65-f3a92879017a-etcd-serving-ca\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.110315 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-audit-dir\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.111107 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4915d3f-e64d-4fad-8933-218f11bc783d-config\") pod \"machine-approver-56656f9798-c9zbz\" (UID: \"a4915d3f-e64d-4fad-8933-218f11bc783d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.111415 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/494fa5aa-0063-4f27-af65-f3a92879017a-node-pullsecrets\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.111608 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92578329-85d1-4295-bc06-88764e9d54c2-client-ca\") pod 
\"route-controller-manager-6576b87f9c-4wrc2\" (UID: \"92578329-85d1-4295-bc06-88764e9d54c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.111693 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/494fa5aa-0063-4f27-af65-f3a92879017a-audit-dir\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.111819 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a4915d3f-e64d-4fad-8933-218f11bc783d-auth-proxy-config\") pod \"machine-approver-56656f9798-c9zbz\" (UID: \"a4915d3f-e64d-4fad-8933-218f11bc783d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.112682 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c39dd70-f596-4e60-b400-3611eddfefc4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2c4kd\" (UID: \"5c39dd70-f596-4e60-b400-3611eddfefc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.113824 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-config\") pod \"controller-manager-879f6c89f-lpsbl\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.113941 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/92578329-85d1-4295-bc06-88764e9d54c2-config\") pod \"route-controller-manager-6576b87f9c-4wrc2\" (UID: \"92578329-85d1-4295-bc06-88764e9d54c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.114967 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/494fa5aa-0063-4f27-af65-f3a92879017a-encryption-config\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.115686 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-client-ca\") pod \"controller-manager-879f6c89f-lpsbl\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.115732 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.116040 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/494fa5aa-0063-4f27-af65-f3a92879017a-etcd-client\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.116599 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92578329-85d1-4295-bc06-88764e9d54c2-serving-cert\") pod \"route-controller-manager-6576b87f9c-4wrc2\" (UID: \"92578329-85d1-4295-bc06-88764e9d54c2\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.117517 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.117778 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.118540 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.119042 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a4915d3f-e64d-4fad-8933-218f11bc783d-machine-approver-tls\") pod \"machine-approver-56656f9798-c9zbz\" (UID: \"a4915d3f-e64d-4fad-8933-218f11bc783d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.119294 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.119707 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-serving-cert\") pod \"controller-manager-879f6c89f-lpsbl\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.120729 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.121026 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.121169 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c39dd70-f596-4e60-b400-3611eddfefc4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2c4kd\" (UID: \"5c39dd70-f596-4e60-b400-3611eddfefc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.129703 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.129876 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.130056 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.130765 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.130914 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.131078 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.131735 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/494fa5aa-0063-4f27-af65-f3a92879017a-serving-cert\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.131906 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.132941 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lpsbl\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.133108 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.133141 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/494fa5aa-0063-4f27-af65-f3a92879017a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc 
kubenswrapper[4898]: I1211 13:06:29.133246 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.133748 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.134704 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/494fa5aa-0063-4f27-af65-f3a92879017a-config\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.134840 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.135181 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/494fa5aa-0063-4f27-af65-f3a92879017a-image-import-ca\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.135488 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/494fa5aa-0063-4f27-af65-f3a92879017a-etcd-serving-ca\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.135563 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5gljx"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.136149 4898 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.136408 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5gljx" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.136664 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-676rf"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.137188 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/494fa5aa-0063-4f27-af65-f3a92879017a-audit\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.137362 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-676rf" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.138040 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.138656 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.139101 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.139639 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.141781 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.141994 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vjbmx"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.142851 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.145628 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vjbmx" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.145752 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.149898 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pjp8z"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.150760 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.157552 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.158153 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.158768 4898 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.159367 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.159390 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.159626 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.160104 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.160136 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.160370 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.166150 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.166375 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-92m7s"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.166900 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.167168 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.168912 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-22s99"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.170095 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-22s99" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.172050 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fvkz4"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.173973 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.179876 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lqr65"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.179934 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cc9qv"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.180606 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-42442"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.180704 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cc9qv" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.204548 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.204996 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.207018 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211125 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af9f94b9-3e85-4e0c-bb97-ac84071e969f-config\") pod \"authentication-operator-69f744f599-pjp8z\" (UID: \"af9f94b9-3e85-4e0c-bb97-ac84071e969f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211188 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqmz6\" (UniqueName: \"kubernetes.io/projected/cf77b13b-75a2-4fc1-a068-8fd33773f827-kube-api-access-vqmz6\") pod \"console-operator-58897d9998-rkglj\" (UID: \"cf77b13b-75a2-4fc1-a068-8fd33773f827\") " pod="openshift-console-operator/console-operator-58897d9998-rkglj" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211254 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhs6j\" (UniqueName: \"kubernetes.io/projected/5442029d-e1aa-4496-b3c6-18f11b034179-kube-api-access-mhs6j\") pod \"downloads-7954f5f757-4bmq4\" (UID: \"5442029d-e1aa-4496-b3c6-18f11b034179\") " pod="openshift-console/downloads-7954f5f757-4bmq4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211278 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/730444c8-a1f8-4b14-b415-4b55869e7da3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gkd27\" (UID: \"730444c8-a1f8-4b14-b415-4b55869e7da3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211326 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zg6l\" (UniqueName: \"kubernetes.io/projected/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-kube-api-access-7zg6l\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211351 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f46b0f35-a460-4977-a99a-8e0ea7015416-serving-cert\") pod \"openshift-config-operator-7777fb866f-7csms\" (UID: \"f46b0f35-a460-4977-a99a-8e0ea7015416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211376 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7b9b7fbd-6a91-4761-8604-b82c417e12f8-stats-auth\") pod \"router-default-5444994796-jmljz\" (UID: \"7b9b7fbd-6a91-4761-8604-b82c417e12f8\") " pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211396 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b9b7fbd-6a91-4761-8604-b82c417e12f8-metrics-certs\") pod \"router-default-5444994796-jmljz\" 
(UID: \"7b9b7fbd-6a91-4761-8604-b82c417e12f8\") " pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211478 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48dab864-fed6-46a5-bae2-85cfa4edc939-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vc9zl\" (UID: \"48dab864-fed6-46a5-bae2-85cfa4edc939\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211503 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-etcd-ca\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211548 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-etcd-service-ca\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211613 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-encryption-config\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211643 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/bd1c8aad-a95c-4ab4-82f9-beb8702efab8-proxy-tls\") pod \"machine-config-operator-74547568cd-kdd7p\" (UID: \"bd1c8aad-a95c-4ab4-82f9-beb8702efab8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211715 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f46b0f35-a460-4977-a99a-8e0ea7015416-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7csms\" (UID: \"f46b0f35-a460-4977-a99a-8e0ea7015416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211774 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7b9b7fbd-6a91-4761-8604-b82c417e12f8-default-certificate\") pod \"router-default-5444994796-jmljz\" (UID: \"7b9b7fbd-6a91-4761-8604-b82c417e12f8\") " pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211805 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dd4c49a-1898-48ab-9ed0-4f455f392b57-config\") pod \"machine-api-operator-5694c8668f-lqr65\" (UID: \"1dd4c49a-1898-48ab-9ed0-4f455f392b57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211852 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1dd4c49a-1898-48ab-9ed0-4f455f392b57-images\") pod \"machine-api-operator-5694c8668f-lqr65\" (UID: \"1dd4c49a-1898-48ab-9ed0-4f455f392b57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" Dec 11 13:06:29 crc kubenswrapper[4898]: 
I1211 13:06:29.211879 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-audit-policies\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211900 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-config\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211952 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-serving-cert\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.211977 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48dab864-fed6-46a5-bae2-85cfa4edc939-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vc9zl\" (UID: \"48dab864-fed6-46a5-bae2-85cfa4edc939\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212025 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf77b13b-75a2-4fc1-a068-8fd33773f827-config\") pod \"console-operator-58897d9998-rkglj\" (UID: \"cf77b13b-75a2-4fc1-a068-8fd33773f827\") " 
pod="openshift-console-operator/console-operator-58897d9998-rkglj" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212048 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/730444c8-a1f8-4b14-b415-4b55869e7da3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gkd27\" (UID: \"730444c8-a1f8-4b14-b415-4b55869e7da3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf77b13b-75a2-4fc1-a068-8fd33773f827-trusted-ca\") pod \"console-operator-58897d9998-rkglj\" (UID: \"cf77b13b-75a2-4fc1-a068-8fd33773f827\") " pod="openshift-console-operator/console-operator-58897d9998-rkglj" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212174 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/730444c8-a1f8-4b14-b415-4b55869e7da3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gkd27\" (UID: \"730444c8-a1f8-4b14-b415-4b55869e7da3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212202 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-config\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212224 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wv759\" (UniqueName: \"kubernetes.io/projected/f46b0f35-a460-4977-a99a-8e0ea7015416-kube-api-access-wv759\") pod \"openshift-config-operator-7777fb866f-7csms\" (UID: \"f46b0f35-a460-4977-a99a-8e0ea7015416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212289 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-service-ca\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212351 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca49685-192c-4594-8046-f104c0590b2f-config\") pod \"kube-controller-manager-operator-78b949d7b-fswhw\" (UID: \"2ca49685-192c-4594-8046-f104c0590b2f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212384 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87f9w\" (UniqueName: \"kubernetes.io/projected/1dd4c49a-1898-48ab-9ed0-4f455f392b57-kube-api-access-87f9w\") pod \"machine-api-operator-5694c8668f-lqr65\" (UID: \"1dd4c49a-1898-48ab-9ed0-4f455f392b57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212439 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b9b7fbd-6a91-4761-8604-b82c417e12f8-service-ca-bundle\") pod \"router-default-5444994796-jmljz\" (UID: \"7b9b7fbd-6a91-4761-8604-b82c417e12f8\") " 
pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212546 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twzqk\" (UniqueName: \"kubernetes.io/projected/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-kube-api-access-twzqk\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212617 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1dd4c49a-1898-48ab-9ed0-4f455f392b57-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lqr65\" (UID: \"1dd4c49a-1898-48ab-9ed0-4f455f392b57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212672 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bd1c8aad-a95c-4ab4-82f9-beb8702efab8-images\") pod \"machine-config-operator-74547568cd-kdd7p\" (UID: \"bd1c8aad-a95c-4ab4-82f9-beb8702efab8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212703 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dnkf\" (UniqueName: \"kubernetes.io/projected/bd1c8aad-a95c-4ab4-82f9-beb8702efab8-kube-api-access-6dnkf\") pod \"machine-config-operator-74547568cd-kdd7p\" (UID: \"bd1c8aad-a95c-4ab4-82f9-beb8702efab8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212753 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/af9f94b9-3e85-4e0c-bb97-ac84071e969f-serving-cert\") pod \"authentication-operator-69f744f599-pjp8z\" (UID: \"af9f94b9-3e85-4e0c-bb97-ac84071e969f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212781 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48dab864-fed6-46a5-bae2-85cfa4edc939-config\") pod \"kube-apiserver-operator-766d6c64bb-vc9zl\" (UID: \"48dab864-fed6-46a5-bae2-85cfa4edc939\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212832 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlqzs\" (UniqueName: \"kubernetes.io/projected/b7239fb8-0fed-4730-9608-db8000ae13dd-kube-api-access-rlqzs\") pod \"dns-operator-744455d44c-mmcc8\" (UID: \"b7239fb8-0fed-4730-9608-db8000ae13dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-mmcc8" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212868 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212922 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gjkg\" (UniqueName: \"kubernetes.io/projected/af9f94b9-3e85-4e0c-bb97-ac84071e969f-kube-api-access-9gjkg\") pod \"authentication-operator-69f744f599-pjp8z\" (UID: \"af9f94b9-3e85-4e0c-bb97-ac84071e969f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" Dec 11 
13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212946 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-serving-cert\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.212998 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-etcd-client\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.213029 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af9f94b9-3e85-4e0c-bb97-ac84071e969f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pjp8z\" (UID: \"af9f94b9-3e85-4e0c-bb97-ac84071e969f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.213082 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd1c8aad-a95c-4ab4-82f9-beb8702efab8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kdd7p\" (UID: \"bd1c8aad-a95c-4ab4-82f9-beb8702efab8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.213105 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-serving-cert\") pod 
\"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.213156 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-etcd-client\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.213182 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2ca49685-192c-4594-8046-f104c0590b2f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fswhw\" (UID: \"2ca49685-192c-4594-8046-f104c0590b2f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.213236 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af9f94b9-3e85-4e0c-bb97-ac84071e969f-service-ca-bundle\") pod \"authentication-operator-69f744f599-pjp8z\" (UID: \"af9f94b9-3e85-4e0c-bb97-ac84071e969f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.213264 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t87p\" (UniqueName: \"kubernetes.io/projected/dd7edc7b-2d3a-4402-b7de-e70de317e52e-kube-api-access-7t87p\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.213312 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmzvf\" (UniqueName: \"kubernetes.io/projected/730444c8-a1f8-4b14-b415-4b55869e7da3-kube-api-access-dmzvf\") pod \"cluster-image-registry-operator-dc59b4c8b-gkd27\" (UID: \"730444c8-a1f8-4b14-b415-4b55869e7da3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.214331 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af9f94b9-3e85-4e0c-bb97-ac84071e969f-config\") pod \"authentication-operator-69f744f599-pjp8z\" (UID: \"af9f94b9-3e85-4e0c-bb97-ac84071e969f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.214425 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf77b13b-75a2-4fc1-a068-8fd33773f827-serving-cert\") pod \"console-operator-58897d9998-rkglj\" (UID: \"cf77b13b-75a2-4fc1-a068-8fd33773f827\") " pod="openshift-console-operator/console-operator-58897d9998-rkglj" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.216336 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4bmq4"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.216775 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-oauth-config\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.216811 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-oauth-serving-cert\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.216837 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca49685-192c-4594-8046-f104c0590b2f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fswhw\" (UID: \"2ca49685-192c-4594-8046-f104c0590b2f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.216859 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7239fb8-0fed-4730-9608-db8000ae13dd-metrics-tls\") pod \"dns-operator-744455d44c-mmcc8\" (UID: \"b7239fb8-0fed-4730-9608-db8000ae13dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-mmcc8" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.217307 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74f7r\" (UniqueName: \"kubernetes.io/projected/7b9b7fbd-6a91-4761-8604-b82c417e12f8-kube-api-access-74f7r\") pod \"router-default-5444994796-jmljz\" (UID: \"7b9b7fbd-6a91-4761-8604-b82c417e12f8\") " pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.217365 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: 
I1211 13:06:29.217400 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-trusted-ca-bundle\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.217999 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dd4c49a-1898-48ab-9ed0-4f455f392b57-config\") pod \"machine-api-operator-5694c8668f-lqr65\" (UID: \"1dd4c49a-1898-48ab-9ed0-4f455f392b57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.218061 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.218103 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af9f94b9-3e85-4e0c-bb97-ac84071e969f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pjp8z\" (UID: \"af9f94b9-3e85-4e0c-bb97-ac84071e969f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.218124 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-audit-dir\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 
13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.218235 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-audit-dir\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.218555 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-audit-policies\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.219652 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af9f94b9-3e85-4e0c-bb97-ac84071e969f-service-ca-bundle\") pod \"authentication-operator-69f744f599-pjp8z\" (UID: \"af9f94b9-3e85-4e0c-bb97-ac84071e969f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.220085 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-encryption-config\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.220678 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7cbpd"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.220904 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1dd4c49a-1898-48ab-9ed0-4f455f392b57-images\") 
pod \"machine-api-operator-5694c8668f-lqr65\" (UID: \"1dd4c49a-1898-48ab-9ed0-4f455f392b57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.221144 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-serving-cert\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.222744 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5gljx"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.223908 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.224792 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.226084 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.226103 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rkglj"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.226082 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-etcd-client\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: 
\"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.226334 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1dd4c49a-1898-48ab-9ed0-4f455f392b57-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lqr65\" (UID: \"1dd4c49a-1898-48ab-9ed0-4f455f392b57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.227198 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-676rf"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.229151 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lbjzz"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.229346 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.230332 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7csms"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.231478 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.232530 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.233544 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lpsbl"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.236134 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.238939 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mdzwq"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.239543 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mdzwq" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.239834 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af9f94b9-3e85-4e0c-bb97-ac84071e969f-serving-cert\") pod \"authentication-operator-69f744f599-pjp8z\" (UID: \"af9f94b9-3e85-4e0c-bb97-ac84071e969f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.240051 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-5ht9s"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.240409 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5ht9s" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.240967 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.241177 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.243512 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.243558 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.246244 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g9vz7"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.246268 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.246881 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-92m7s"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.248042 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.248803 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.249882 4898 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.250869 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.252113 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cc9qv"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.253204 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.254263 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.255258 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vjbmx"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.256257 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mmcc8"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.257206 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mdzwq"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.258365 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.259342 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-22s99"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.260568 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n8qh4"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 
13:06:29.260932 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.261770 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.262246 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n8qh4"] Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.281057 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.301041 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.319306 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48dab864-fed6-46a5-bae2-85cfa4edc939-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vc9zl\" (UID: \"48dab864-fed6-46a5-bae2-85cfa4edc939\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.319438 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-etcd-ca\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.319509 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-etcd-service-ca\") pod \"etcd-operator-b45778765-42442\" (UID: 
\"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.319533 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd1c8aad-a95c-4ab4-82f9-beb8702efab8-proxy-tls\") pod \"machine-config-operator-74547568cd-kdd7p\" (UID: \"bd1c8aad-a95c-4ab4-82f9-beb8702efab8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.320209 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-etcd-service-ca\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.320286 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f46b0f35-a460-4977-a99a-8e0ea7015416-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7csms\" (UID: \"f46b0f35-a460-4977-a99a-8e0ea7015416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.320297 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-etcd-ca\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.320317 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/7b9b7fbd-6a91-4761-8604-b82c417e12f8-default-certificate\") pod \"router-default-5444994796-jmljz\" (UID: \"7b9b7fbd-6a91-4761-8604-b82c417e12f8\") " pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.320357 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-config\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.320373 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf77b13b-75a2-4fc1-a068-8fd33773f827-config\") pod \"console-operator-58897d9998-rkglj\" (UID: \"cf77b13b-75a2-4fc1-a068-8fd33773f827\") " pod="openshift-console-operator/console-operator-58897d9998-rkglj" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.320611 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f46b0f35-a460-4977-a99a-8e0ea7015416-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7csms\" (UID: \"f46b0f35-a460-4977-a99a-8e0ea7015416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321004 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321066 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf77b13b-75a2-4fc1-a068-8fd33773f827-config\") pod \"console-operator-58897d9998-rkglj\" (UID: 
\"cf77b13b-75a2-4fc1-a068-8fd33773f827\") " pod="openshift-console-operator/console-operator-58897d9998-rkglj" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.320448 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48dab864-fed6-46a5-bae2-85cfa4edc939-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vc9zl\" (UID: \"48dab864-fed6-46a5-bae2-85cfa4edc939\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321149 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/730444c8-a1f8-4b14-b415-4b55869e7da3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gkd27\" (UID: \"730444c8-a1f8-4b14-b415-4b55869e7da3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321204 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/730444c8-a1f8-4b14-b415-4b55869e7da3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gkd27\" (UID: \"730444c8-a1f8-4b14-b415-4b55869e7da3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321230 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-config\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321256 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wv759\" (UniqueName: \"kubernetes.io/projected/f46b0f35-a460-4977-a99a-8e0ea7015416-kube-api-access-wv759\") pod \"openshift-config-operator-7777fb866f-7csms\" (UID: \"f46b0f35-a460-4977-a99a-8e0ea7015416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321285 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf77b13b-75a2-4fc1-a068-8fd33773f827-trusted-ca\") pod \"console-operator-58897d9998-rkglj\" (UID: \"cf77b13b-75a2-4fc1-a068-8fd33773f827\") " pod="openshift-console-operator/console-operator-58897d9998-rkglj" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321307 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-service-ca\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321325 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca49685-192c-4594-8046-f104c0590b2f-config\") pod \"kube-controller-manager-operator-78b949d7b-fswhw\" (UID: \"2ca49685-192c-4594-8046-f104c0590b2f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321357 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b9b7fbd-6a91-4761-8604-b82c417e12f8-service-ca-bundle\") pod \"router-default-5444994796-jmljz\" (UID: \"7b9b7fbd-6a91-4761-8604-b82c417e12f8\") " pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 
13:06:29.321391 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bd1c8aad-a95c-4ab4-82f9-beb8702efab8-images\") pod \"machine-config-operator-74547568cd-kdd7p\" (UID: \"bd1c8aad-a95c-4ab4-82f9-beb8702efab8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321410 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dnkf\" (UniqueName: \"kubernetes.io/projected/bd1c8aad-a95c-4ab4-82f9-beb8702efab8-kube-api-access-6dnkf\") pod \"machine-config-operator-74547568cd-kdd7p\" (UID: \"bd1c8aad-a95c-4ab4-82f9-beb8702efab8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321430 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlqzs\" (UniqueName: \"kubernetes.io/projected/b7239fb8-0fed-4730-9608-db8000ae13dd-kube-api-access-rlqzs\") pod \"dns-operator-744455d44c-mmcc8\" (UID: \"b7239fb8-0fed-4730-9608-db8000ae13dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-mmcc8" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321447 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48dab864-fed6-46a5-bae2-85cfa4edc939-config\") pod \"kube-apiserver-operator-766d6c64bb-vc9zl\" (UID: \"48dab864-fed6-46a5-bae2-85cfa4edc939\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321483 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-serving-cert\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") 
" pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321509 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd1c8aad-a95c-4ab4-82f9-beb8702efab8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kdd7p\" (UID: \"bd1c8aad-a95c-4ab4-82f9-beb8702efab8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321528 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-serving-cert\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321547 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-etcd-client\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321570 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2ca49685-192c-4594-8046-f104c0590b2f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fswhw\" (UID: \"2ca49685-192c-4594-8046-f104c0590b2f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321592 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t87p\" (UniqueName: 
\"kubernetes.io/projected/dd7edc7b-2d3a-4402-b7de-e70de317e52e-kube-api-access-7t87p\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321615 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmzvf\" (UniqueName: \"kubernetes.io/projected/730444c8-a1f8-4b14-b415-4b55869e7da3-kube-api-access-dmzvf\") pod \"cluster-image-registry-operator-dc59b4c8b-gkd27\" (UID: \"730444c8-a1f8-4b14-b415-4b55869e7da3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321641 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-oauth-config\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321660 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-oauth-serving-cert\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321683 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf77b13b-75a2-4fc1-a068-8fd33773f827-serving-cert\") pod \"console-operator-58897d9998-rkglj\" (UID: \"cf77b13b-75a2-4fc1-a068-8fd33773f827\") " pod="openshift-console-operator/console-operator-58897d9998-rkglj" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321702 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca49685-192c-4594-8046-f104c0590b2f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fswhw\" (UID: \"2ca49685-192c-4594-8046-f104c0590b2f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321718 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7239fb8-0fed-4730-9608-db8000ae13dd-metrics-tls\") pod \"dns-operator-744455d44c-mmcc8\" (UID: \"b7239fb8-0fed-4730-9608-db8000ae13dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-mmcc8" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321736 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74f7r\" (UniqueName: \"kubernetes.io/projected/7b9b7fbd-6a91-4761-8604-b82c417e12f8-kube-api-access-74f7r\") pod \"router-default-5444994796-jmljz\" (UID: \"7b9b7fbd-6a91-4761-8604-b82c417e12f8\") " pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321755 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-trusted-ca-bundle\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.321958 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-config\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.322354 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca49685-192c-4594-8046-f104c0590b2f-config\") pod \"kube-controller-manager-operator-78b949d7b-fswhw\" (UID: \"2ca49685-192c-4594-8046-f104c0590b2f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.322494 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-config\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.323271 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b9b7fbd-6a91-4761-8604-b82c417e12f8-service-ca-bundle\") pod \"router-default-5444994796-jmljz\" (UID: \"7b9b7fbd-6a91-4761-8604-b82c417e12f8\") " pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.324012 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-oauth-serving-cert\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.324044 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd1c8aad-a95c-4ab4-82f9-beb8702efab8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kdd7p\" (UID: \"bd1c8aad-a95c-4ab4-82f9-beb8702efab8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" Dec 11 13:06:29 crc 
kubenswrapper[4898]: I1211 13:06:29.324339 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf77b13b-75a2-4fc1-a068-8fd33773f827-trusted-ca\") pod \"console-operator-58897d9998-rkglj\" (UID: \"cf77b13b-75a2-4fc1-a068-8fd33773f827\") " pod="openshift-console-operator/console-operator-58897d9998-rkglj" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.325168 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/730444c8-a1f8-4b14-b415-4b55869e7da3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gkd27\" (UID: \"730444c8-a1f8-4b14-b415-4b55869e7da3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.325287 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-etcd-client\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.325533 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-service-ca\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.325576 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqmz6\" (UniqueName: \"kubernetes.io/projected/cf77b13b-75a2-4fc1-a068-8fd33773f827-kube-api-access-vqmz6\") pod \"console-operator-58897d9998-rkglj\" (UID: \"cf77b13b-75a2-4fc1-a068-8fd33773f827\") " 
pod="openshift-console-operator/console-operator-58897d9998-rkglj" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.325673 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhs6j\" (UniqueName: \"kubernetes.io/projected/5442029d-e1aa-4496-b3c6-18f11b034179-kube-api-access-mhs6j\") pod \"downloads-7954f5f757-4bmq4\" (UID: \"5442029d-e1aa-4496-b3c6-18f11b034179\") " pod="openshift-console/downloads-7954f5f757-4bmq4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.325718 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/730444c8-a1f8-4b14-b415-4b55869e7da3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gkd27\" (UID: \"730444c8-a1f8-4b14-b415-4b55869e7da3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.325756 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zg6l\" (UniqueName: \"kubernetes.io/projected/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-kube-api-access-7zg6l\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.325788 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f46b0f35-a460-4977-a99a-8e0ea7015416-serving-cert\") pod \"openshift-config-operator-7777fb866f-7csms\" (UID: \"f46b0f35-a460-4977-a99a-8e0ea7015416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.325812 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-trusted-ca-bundle\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.325819 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7b9b7fbd-6a91-4761-8604-b82c417e12f8-stats-auth\") pod \"router-default-5444994796-jmljz\" (UID: \"7b9b7fbd-6a91-4761-8604-b82c417e12f8\") " pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.325852 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b9b7fbd-6a91-4761-8604-b82c417e12f8-metrics-certs\") pod \"router-default-5444994796-jmljz\" (UID: \"7b9b7fbd-6a91-4761-8604-b82c417e12f8\") " pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.326715 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/730444c8-a1f8-4b14-b415-4b55869e7da3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gkd27\" (UID: \"730444c8-a1f8-4b14-b415-4b55869e7da3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.327538 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-oauth-config\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.328117 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2ca49685-192c-4594-8046-f104c0590b2f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fswhw\" (UID: \"2ca49685-192c-4594-8046-f104c0590b2f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.328689 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-serving-cert\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.328707 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-serving-cert\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.329257 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f46b0f35-a460-4977-a99a-8e0ea7015416-serving-cert\") pod \"openshift-config-operator-7777fb866f-7csms\" (UID: \"f46b0f35-a460-4977-a99a-8e0ea7015416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.333944 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf77b13b-75a2-4fc1-a068-8fd33773f827-serving-cert\") pod \"console-operator-58897d9998-rkglj\" (UID: \"cf77b13b-75a2-4fc1-a068-8fd33773f827\") " pod="openshift-console-operator/console-operator-58897d9998-rkglj" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.341893 4898 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.360800 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.380853 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.393413 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7b9b7fbd-6a91-4761-8604-b82c417e12f8-default-certificate\") pod \"router-default-5444994796-jmljz\" (UID: \"7b9b7fbd-6a91-4761-8604-b82c417e12f8\") " pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.400821 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.420536 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.440688 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.449580 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7b9b7fbd-6a91-4761-8604-b82c417e12f8-stats-auth\") pod \"router-default-5444994796-jmljz\" (UID: \"7b9b7fbd-6a91-4761-8604-b82c417e12f8\") " pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.461628 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.469655 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b9b7fbd-6a91-4761-8604-b82c417e12f8-metrics-certs\") pod \"router-default-5444994796-jmljz\" (UID: \"7b9b7fbd-6a91-4761-8604-b82c417e12f8\") " pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.481305 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.502940 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.508270 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7239fb8-0fed-4730-9608-db8000ae13dd-metrics-tls\") pod \"dns-operator-744455d44c-mmcc8\" (UID: \"b7239fb8-0fed-4730-9608-db8000ae13dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-mmcc8" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.520622 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.541113 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.561137 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.564995 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bd1c8aad-a95c-4ab4-82f9-beb8702efab8-images\") pod 
\"machine-config-operator-74547568cd-kdd7p\" (UID: \"bd1c8aad-a95c-4ab4-82f9-beb8702efab8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.580990 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.594706 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd1c8aad-a95c-4ab4-82f9-beb8702efab8-proxy-tls\") pod \"machine-config-operator-74547568cd-kdd7p\" (UID: \"bd1c8aad-a95c-4ab4-82f9-beb8702efab8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.602340 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.621659 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.628722 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:29 crc kubenswrapper[4898]: E1211 13:06:29.628998 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:08:31.628977264 +0000 UTC m=+269.201303701 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.640579 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.660936 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.681098 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.713711 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76hpf\" (UniqueName: \"kubernetes.io/projected/5c39dd70-f596-4e60-b400-3611eddfefc4-kube-api-access-76hpf\") pod \"openshift-apiserver-operator-796bbdcf4f-2c4kd\" (UID: \"5c39dd70-f596-4e60-b400-3611eddfefc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.730692 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.730783 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.730809 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.730893 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.731774 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.733948 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.734080 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.734186 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.735299 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vrf2\" (UniqueName: \"kubernetes.io/projected/a4915d3f-e64d-4fad-8933-218f11bc783d-kube-api-access-4vrf2\") pod \"machine-approver-56656f9798-c9zbz\" (UID: \"a4915d3f-e64d-4fad-8933-218f11bc783d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.758643 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db2rp\" (UniqueName: \"kubernetes.io/projected/92578329-85d1-4295-bc06-88764e9d54c2-kube-api-access-db2rp\") pod \"route-controller-manager-6576b87f9c-4wrc2\" (UID: \"92578329-85d1-4295-bc06-88764e9d54c2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.776929 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xxtlp\" (UniqueName: \"kubernetes.io/projected/494fa5aa-0063-4f27-af65-f3a92879017a-kube-api-access-xxtlp\") pod \"apiserver-76f77b778f-fvkz4\" (UID: \"494fa5aa-0063-4f27-af65-f3a92879017a\") " pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.795369 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljmkm\" (UniqueName: \"kubernetes.io/projected/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-kube-api-access-ljmkm\") pod \"controller-manager-879f6c89f-lpsbl\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.800064 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.800240 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.817331 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.821400 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.826950 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.841245 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.844977 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48dab864-fed6-46a5-bae2-85cfa4edc939-config\") pod \"kube-apiserver-operator-766d6c64bb-vc9zl\" (UID: \"48dab864-fed6-46a5-bae2-85cfa4edc939\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.862371 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.874127 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48dab864-fed6-46a5-bae2-85cfa4edc939-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-vc9zl\" (UID: \"48dab864-fed6-46a5-bae2-85cfa4edc939\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.885974 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.909726 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.921841 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.941490 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"metrics-tls" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.955957 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.962559 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 11 13:06:29 crc kubenswrapper[4898]: W1211 13:06:29.969295 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4915d3f_e64d_4fad_8933_218f11bc783d.slice/crio-2b022008ec4629adf7e35557ce69787ed8e782d8c60ac0eeefc58a0c531a9728 WatchSource:0}: Error finding container 2b022008ec4629adf7e35557ce69787ed8e782d8c60ac0eeefc58a0c531a9728: Status 404 returned error can't find the container with id 2b022008ec4629adf7e35557ce69787ed8e782d8c60ac0eeefc58a0c531a9728 Dec 11 13:06:29 crc kubenswrapper[4898]: I1211 13:06:29.983785 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.011261 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd" Dec 11 13:06:30 crc kubenswrapper[4898]: W1211 13:06:30.014704 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-de976b481e110d969123e015f84d7296b5ffca594d727c9c8583e8995e5f9d9d WatchSource:0}: Error finding container de976b481e110d969123e015f84d7296b5ffca594d727c9c8583e8995e5f9d9d: Status 404 returned error can't find the container with id de976b481e110d969123e015f84d7296b5ffca594d727c9c8583e8995e5f9d9d Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.020654 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 11 13:06:30 crc kubenswrapper[4898]: W1211 13:06:30.032662 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-82a170826a451e9d7da00764031b7d40c5fa411f0a79c4e62fa85281a7f61164 WatchSource:0}: Error finding container 82a170826a451e9d7da00764031b7d40c5fa411f0a79c4e62fa85281a7f61164: Status 404 returned error can't find the container with id 82a170826a451e9d7da00764031b7d40c5fa411f0a79c4e62fa85281a7f61164 Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.040915 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.048108 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:06:30 crc kubenswrapper[4898]: W1211 13:06:30.058525 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-e650f723a2e4c9d0a0b3d53993344b3bec7e9b2f4cb420aaeebcd07a988aa7c5 WatchSource:0}: Error finding container e650f723a2e4c9d0a0b3d53993344b3bec7e9b2f4cb420aaeebcd07a988aa7c5: Status 404 returned error can't find the container with id e650f723a2e4c9d0a0b3d53993344b3bec7e9b2f4cb420aaeebcd07a988aa7c5 Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.063864 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.071101 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.085922 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.100748 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.128439 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.139434 4898 request.go:700] Waited for 1.000508018s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.141316 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.161100 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.185051 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.200846 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.205122 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fvkz4"] Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.221809 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.241858 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.261202 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.262057 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd"] Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.283469 
4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.301991 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.324020 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.324058 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2"] Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.334303 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lpsbl"] Dec 11 13:06:30 crc kubenswrapper[4898]: W1211 13:06:30.335694 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92578329_85d1_4295_bc06_88764e9d54c2.slice/crio-15b3f81d1ee3b8f5b1bdb46f09546b19243a411c1d45876cf30a4487834827a1 WatchSource:0}: Error finding container 15b3f81d1ee3b8f5b1bdb46f09546b19243a411c1d45876cf30a4487834827a1: Status 404 returned error can't find the container with id 15b3f81d1ee3b8f5b1bdb46f09546b19243a411c1d45876cf30a4487834827a1 Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.340761 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 11 13:06:30 crc kubenswrapper[4898]: W1211 13:06:30.352167 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3c9df9f_29e5_45ed_bd07_642ae50d0ed9.slice/crio-faac4b437985a5980206e67d881ecc78ea05fd74851705ae735b5f3da3f93ead WatchSource:0}: Error finding container faac4b437985a5980206e67d881ecc78ea05fd74851705ae735b5f3da3f93ead: Status 404 returned error can't find the container with id faac4b437985a5980206e67d881ecc78ea05fd74851705ae735b5f3da3f93ead Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.360817 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.380812 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.401309 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.421663 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.442795 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.460362 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.482688 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.502348 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 11 13:06:30 crc kubenswrapper[4898]: 
I1211 13:06:30.521659 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.542049 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.544262 4898 generic.go:334] "Generic (PLEG): container finished" podID="494fa5aa-0063-4f27-af65-f3a92879017a" containerID="e9e2e56a02c0a797ceadbc70ce51899befec51ecb71faedc01d146550f023697" exitCode=0
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.544321 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" event={"ID":"494fa5aa-0063-4f27-af65-f3a92879017a","Type":"ContainerDied","Data":"e9e2e56a02c0a797ceadbc70ce51899befec51ecb71faedc01d146550f023697"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.544350 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" event={"ID":"494fa5aa-0063-4f27-af65-f3a92879017a","Type":"ContainerStarted","Data":"9f22e37465ee00a9c4e54f6f328eeb05a268b416d8294918310f70a0648033c4"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.547651 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6266c96a0fca317218a88fb9d732f7060bf5d3e1e4d83d5ccb36dfe257741fb6"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.547712 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"de976b481e110d969123e015f84d7296b5ffca594d727c9c8583e8995e5f9d9d"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.548945 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ed1abde848d9838d6a081c679cf00ca05302921a1df065803160344e70e1de2f"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.548977 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e650f723a2e4c9d0a0b3d53993344b3bec7e9b2f4cb420aaeebcd07a988aa7c5"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.549229 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.550631 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" event={"ID":"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9","Type":"ContainerStarted","Data":"ab995c5f1f5f9f9d65b60620584c0282782f1011515d8925109e9439571767a7"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.550670 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" event={"ID":"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9","Type":"ContainerStarted","Data":"faac4b437985a5980206e67d881ecc78ea05fd74851705ae735b5f3da3f93ead"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.551394 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.552697 4898 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-lpsbl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body=
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.552739 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" podUID="a3c9df9f-29e5-45ed-bd07-642ae50d0ed9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.553074 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" event={"ID":"92578329-85d1-4295-bc06-88764e9d54c2","Type":"ContainerStarted","Data":"8172453c4fccc0cee6395f6ebd67fda17373afb5b60717670fd9f48c977824f6"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.553104 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" event={"ID":"92578329-85d1-4295-bc06-88764e9d54c2","Type":"ContainerStarted","Data":"15b3f81d1ee3b8f5b1bdb46f09546b19243a411c1d45876cf30a4487834827a1"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.553701 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.554574 4898 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-4wrc2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.554607 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" podUID="92578329-85d1-4295-bc06-88764e9d54c2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.555231 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd" event={"ID":"5c39dd70-f596-4e60-b400-3611eddfefc4","Type":"ContainerStarted","Data":"69b71c79295d17d8495a531ac5db8778fbb6ee9c98495005fb5d0a263e14eed5"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.555261 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd" event={"ID":"5c39dd70-f596-4e60-b400-3611eddfefc4","Type":"ContainerStarted","Data":"e0e56187a2fc27a337a1be13f6a236e8be4809ff91d32d9919060aa0268bf1ed"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.557157 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" event={"ID":"a4915d3f-e64d-4fad-8933-218f11bc783d","Type":"ContainerStarted","Data":"1c5b3549c2891d9075811360199dac898a00ab64b0f8116c9613f07807265b20"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.557191 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" event={"ID":"a4915d3f-e64d-4fad-8933-218f11bc783d","Type":"ContainerStarted","Data":"59da598bdc1e3c1b74795c0f73945b3dd1ed53150db4568422af5a34efa169d2"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.557209 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" event={"ID":"a4915d3f-e64d-4fad-8933-218f11bc783d","Type":"ContainerStarted","Data":"2b022008ec4629adf7e35557ce69787ed8e782d8c60ac0eeefc58a0c531a9728"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.560328 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.564981 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"39f9f7730afd7884f5094b224de523a029afff55b2536299749d0cdfa9e792c8"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.565025 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"82a170826a451e9d7da00764031b7d40c5fa411f0a79c4e62fa85281a7f61164"}
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.581925 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.601249 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.621520 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.648752 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.662006 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.681041 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.700845 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.720909 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.741003 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.761569 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.781395 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.802754 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.820934 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.840868 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.862412 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.881076 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.900504 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.921059 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.967484 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gjkg\" (UniqueName: \"kubernetes.io/projected/af9f94b9-3e85-4e0c-bb97-ac84071e969f-kube-api-access-9gjkg\") pod \"authentication-operator-69f744f599-pjp8z\" (UID: \"af9f94b9-3e85-4e0c-bb97-ac84071e969f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z"
Dec 11 13:06:30 crc kubenswrapper[4898]: I1211 13:06:30.982327 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87f9w\" (UniqueName: \"kubernetes.io/projected/1dd4c49a-1898-48ab-9ed0-4f455f392b57-kube-api-access-87f9w\") pod \"machine-api-operator-5694c8668f-lqr65\" (UID: \"1dd4c49a-1898-48ab-9ed0-4f455f392b57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.000039 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twzqk\" (UniqueName: \"kubernetes.io/projected/1bd37230-d5b0-47d1-b4c6-df3c1ad3788c-kube-api-access-twzqk\") pod \"apiserver-7bbb656c7d-vgm9d\" (UID: \"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.002749 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.021178 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.040648 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.062016 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.080307 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.089325 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.101076 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.121362 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.137442 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.143295 4898 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.159900 4898 request.go:700] Waited for 1.897864122s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.162087 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.180484 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.214671 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.229597 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48dab864-fed6-46a5-bae2-85cfa4edc939-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-vc9zl\" (UID: \"48dab864-fed6-46a5-bae2-85cfa4edc939\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.258949 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv759\" (UniqueName: \"kubernetes.io/projected/f46b0f35-a460-4977-a99a-8e0ea7015416-kube-api-access-wv759\") pod \"openshift-config-operator-7777fb866f-7csms\" (UID: \"f46b0f35-a460-4977-a99a-8e0ea7015416\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.270053 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlqzs\" (UniqueName: \"kubernetes.io/projected/b7239fb8-0fed-4730-9608-db8000ae13dd-kube-api-access-rlqzs\") pod \"dns-operator-744455d44c-mmcc8\" (UID: \"b7239fb8-0fed-4730-9608-db8000ae13dd\") " pod="openshift-dns-operator/dns-operator-744455d44c-mmcc8"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.285008 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dnkf\" (UniqueName: \"kubernetes.io/projected/bd1c8aad-a95c-4ab4-82f9-beb8702efab8-kube-api-access-6dnkf\") pod \"machine-config-operator-74547568cd-kdd7p\" (UID: \"bd1c8aad-a95c-4ab4-82f9-beb8702efab8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.301159 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2ca49685-192c-4594-8046-f104c0590b2f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fswhw\" (UID: \"2ca49685-192c-4594-8046-f104c0590b2f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.318201 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t87p\" (UniqueName: \"kubernetes.io/projected/dd7edc7b-2d3a-4402-b7de-e70de317e52e-kube-api-access-7t87p\") pod \"console-f9d7485db-7cbpd\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " pod="openshift-console/console-f9d7485db-7cbpd"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.336223 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmzvf\" (UniqueName: \"kubernetes.io/projected/730444c8-a1f8-4b14-b415-4b55869e7da3-kube-api-access-dmzvf\") pod \"cluster-image-registry-operator-dc59b4c8b-gkd27\" (UID: \"730444c8-a1f8-4b14-b415-4b55869e7da3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.366762 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74f7r\" (UniqueName: \"kubernetes.io/projected/7b9b7fbd-6a91-4761-8604-b82c417e12f8-kube-api-access-74f7r\") pod \"router-default-5444994796-jmljz\" (UID: \"7b9b7fbd-6a91-4761-8604-b82c417e12f8\") " pod="openshift-ingress/router-default-5444994796-jmljz"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.375181 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.379362 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqmz6\" (UniqueName: \"kubernetes.io/projected/cf77b13b-75a2-4fc1-a068-8fd33773f827-kube-api-access-vqmz6\") pod \"console-operator-58897d9998-rkglj\" (UID: \"cf77b13b-75a2-4fc1-a068-8fd33773f827\") " pod="openshift-console-operator/console-operator-58897d9998-rkglj"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.385011 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7cbpd"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.392883 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.398343 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhs6j\" (UniqueName: \"kubernetes.io/projected/5442029d-e1aa-4496-b3c6-18f11b034179-kube-api-access-mhs6j\") pod \"downloads-7954f5f757-4bmq4\" (UID: \"5442029d-e1aa-4496-b3c6-18f11b034179\") " pod="openshift-console/downloads-7954f5f757-4bmq4"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.403783 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d"]
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.414320 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jmljz"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.419425 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/730444c8-a1f8-4b14-b415-4b55869e7da3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gkd27\" (UID: \"730444c8-a1f8-4b14-b415-4b55869e7da3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.424775 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pjp8z"]
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.424816 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.439143 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zg6l\" (UniqueName: \"kubernetes.io/projected/5ec5ed0a-c3c1-4205-869f-c126a1249aa6-kube-api-access-7zg6l\") pod \"etcd-operator-b45778765-42442\" (UID: \"5ec5ed0a-c3c1-4205-869f-c126a1249aa6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42442"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.441628 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mmcc8"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.446584 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.466746 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.506069 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lqr65"]
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562210 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-audit-dir\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562268 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-registry-tls\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562297 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562356 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-registry-certificates\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562386 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562427 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-audit-policies\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562446 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6j5q\" (UniqueName: \"kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-kube-api-access-b6j5q\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562504 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562570 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065a3503-e7a8-4b7d-9cb7-366489ac0247-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9xjl\" (UID: \"065a3503-e7a8-4b7d-9cb7-366489ac0247\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562599 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562620 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/065a3503-e7a8-4b7d-9cb7-366489ac0247-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9xjl\" (UID: \"065a3503-e7a8-4b7d-9cb7-366489ac0247\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562638 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562658 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562681 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nqx5\" (UniqueName: \"kubernetes.io/projected/065a3503-e7a8-4b7d-9cb7-366489ac0247-kube-api-access-4nqx5\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9xjl\" (UID: \"065a3503-e7a8-4b7d-9cb7-366489ac0247\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562702 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562722 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562742 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-bound-sa-token\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562762 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562791 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562813 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562848 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562869 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmt9s\" (UniqueName: \"kubernetes.io/projected/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-kube-api-access-lmt9s\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562890 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e65a531b-17ae-4b24-b9f6-71c758a757b0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nnwb2\" (UID: \"e65a531b-17ae-4b24-b9f6-71c758a757b0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562910 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-trusted-ca\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562952 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcnzx\" (UniqueName: \"kubernetes.io/projected/e65a531b-17ae-4b24-b9f6-71c758a757b0-kube-api-access-rcnzx\") pod \"cluster-samples-operator-665b6dd947-nnwb2\" (UID: \"e65a531b-17ae-4b24-b9f6-71c758a757b0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562969 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz"
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.562988 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz"
Dec 11 13:06:31 crc kubenswrapper[4898]: E1211 13:06:31.575272 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:32.075240898 +0000 UTC m=+149.647567335 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.591742 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jmljz" event={"ID":"7b9b7fbd-6a91-4761-8604-b82c417e12f8","Type":"ContainerStarted","Data":"87330c50920e0013331b7e66b2dea9400abd3ce1ba1e97bf4ea9eb153a1dc376"}
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.592816 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" event={"ID":"1dd4c49a-1898-48ab-9ed0-4f455f392b57","Type":"ContainerStarted","Data":"7b75ccba8cc5f624dc5f4a4cd8e432fe4324449a1262035f7a79e77951b36d1b"}
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.638238 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7csms"]
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.645251 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" event={"ID":"494fa5aa-0063-4f27-af65-f3a92879017a","Type":"ContainerStarted","Data":"e89f36eb0e4fe95ec108e5426972d49190c4decc7a2cdbd0c917e685bdf96e18"}
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.645299 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" event={"ID":"494fa5aa-0063-4f27-af65-f3a92879017a","Type":"ContainerStarted","Data":"0eb937b887d2be64a5eb54f85cb943fdda8f1186acba23249c295833440449b3"}
Dec 11 13:06:31 crc kubenswrapper[4898]:
I1211 13:06:31.655091 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" event={"ID":"af9f94b9-3e85-4e0c-bb97-ac84071e969f","Type":"ContainerStarted","Data":"fa6135cb56b252e3b1f881ea7dff35610c6ad35905e6d8ca9529ee447d630d7d"} Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.657954 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" event={"ID":"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c","Type":"ContainerStarted","Data":"41de305968d6b7c61cd5591d1bf63d27d8570064a2e0fe27719ff223acf942e9"} Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.665644 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.665869 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39bfd95c-8066-475a-ae86-f34ce2cae1e7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vjbmx\" (UID: \"39bfd95c-8066-475a-ae86-f34ce2cae1e7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vjbmx" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.665920 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccaafcb8-0877-4754-b7f7-43d3c35c6283-cert\") pod \"ingress-canary-mdzwq\" (UID: \"ccaafcb8-0877-4754-b7f7-43d3c35c6283\") " pod="openshift-ingress-canary/ingress-canary-mdzwq" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.665947 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntjcn\" (UniqueName: \"kubernetes.io/projected/14360874-1ac7-4262-b7ed-3ccc4d909191-kube-api-access-ntjcn\") pod \"packageserver-d55dfcdfc-pjgq4\" (UID: \"14360874-1ac7-4262-b7ed-3ccc4d909191\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.665985 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3ab8ca08-87a3-4920-aa33-6bbc290b1c15-signing-key\") pod \"service-ca-9c57cc56f-22s99\" (UID: \"3ab8ca08-87a3-4920-aa33-6bbc290b1c15\") " pod="openshift-service-ca/service-ca-9c57cc56f-22s99" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.666007 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14360874-1ac7-4262-b7ed-3ccc4d909191-webhook-cert\") pod \"packageserver-d55dfcdfc-pjgq4\" (UID: \"14360874-1ac7-4262-b7ed-3ccc4d909191\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.666040 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t52vq\" (UniqueName: \"kubernetes.io/projected/588998fa-36f4-49d4-a69b-d60a3952787b-kube-api-access-t52vq\") pod \"olm-operator-6b444d44fb-nv8jk\" (UID: \"588998fa-36f4-49d4-a69b-d60a3952787b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.666081 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c000ebbb-ebbf-4861-b148-6a21649befb4-srv-cert\") pod \"catalog-operator-68c6474976-ntzhr\" (UID: \"c000ebbb-ebbf-4861-b148-6a21649befb4\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.666102 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jcpb\" (UniqueName: \"kubernetes.io/projected/2ae587f2-140e-4780-844e-1eb3430f7ee6-kube-api-access-7jcpb\") pod \"dns-default-cc9qv\" (UID: \"2ae587f2-140e-4780-844e-1eb3430f7ee6\") " pod="openshift-dns/dns-default-cc9qv" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.666136 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/065a3503-e7a8-4b7d-9cb7-366489ac0247-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9xjl\" (UID: \"065a3503-e7a8-4b7d-9cb7-366489ac0247\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.666169 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nqx5\" (UniqueName: \"kubernetes.io/projected/065a3503-e7a8-4b7d-9cb7-366489ac0247-kube-api-access-4nqx5\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9xjl\" (UID: \"065a3503-e7a8-4b7d-9cb7-366489ac0247\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.666217 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnkql\" (UniqueName: \"kubernetes.io/projected/d5bcb23e-d100-4e41-bcc4-e8773a821c91-kube-api-access-fnkql\") pod \"ingress-operator-5b745b69d9-6df8v\" (UID: \"d5bcb23e-d100-4e41-bcc4-e8773a821c91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.666243 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.666292 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.666319 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25hbq\" (UniqueName: \"kubernetes.io/projected/2b85feb3-50df-4ad9-ac89-de94e7842c7e-kube-api-access-25hbq\") pod \"service-ca-operator-777779d784-2cnbc\" (UID: \"2b85feb3-50df-4ad9-ac89-de94e7842c7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.666337 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/502c4cda-4f15-490b-bce9-a51f62fb940e-certs\") pod \"machine-config-server-5ht9s\" (UID: \"502c4cda-4f15-490b-bce9-a51f62fb940e\") " pod="openshift-machine-config-operator/machine-config-server-5ht9s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.666358 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pfkw\" (UniqueName: \"kubernetes.io/projected/e650293a-d60c-4e05-88dd-ea1fa46b3492-kube-api-access-9pfkw\") pod \"collect-profiles-29424300-txml6\" (UID: 
\"e650293a-d60c-4e05-88dd-ea1fa46b3492\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.666396 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.668893 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.669886 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d25d3be-3b53-4824-b357-5f251a16aa38-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25l4n\" (UID: \"4d25d3be-3b53-4824-b357-5f251a16aa38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n" Dec 11 13:06:31 crc kubenswrapper[4898]: E1211 13:06:31.669943 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:32.169919174 +0000 UTC m=+149.742245621 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.670535 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rkglj" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.675229 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ae587f2-140e-4780-844e-1eb3430f7ee6-config-volume\") pod \"dns-default-cc9qv\" (UID: \"2ae587f2-140e-4780-844e-1eb3430f7ee6\") " pod="openshift-dns/dns-default-cc9qv" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.676494 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcnzx\" (UniqueName: \"kubernetes.io/projected/e65a531b-17ae-4b24-b9f6-71c758a757b0-kube-api-access-rcnzx\") pod \"cluster-samples-operator-665b6dd947-nnwb2\" (UID: \"e65a531b-17ae-4b24-b9f6-71c758a757b0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.676876 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.678300 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.677654 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c000ebbb-ebbf-4861-b148-6a21649befb4-profile-collector-cert\") pod \"catalog-operator-68c6474976-ntzhr\" (UID: \"c000ebbb-ebbf-4861-b148-6a21649befb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.678472 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-audit-dir\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.678498 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14360874-1ac7-4262-b7ed-3ccc4d909191-apiservice-cert\") pod \"packageserver-d55dfcdfc-pjgq4\" (UID: \"14360874-1ac7-4262-b7ed-3ccc4d909191\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.678526 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 
13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.678603 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-registry-certificates\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.678727 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-4bmq4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.679211 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-audit-dir\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.684506 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d25d3be-3b53-4824-b357-5f251a16aa38-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25l4n\" (UID: \"4d25d3be-3b53-4824-b357-5f251a16aa38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.684573 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dx7q\" (UniqueName: \"kubernetes.io/projected/aeb38d85-05d0-4a84-b3a7-4a7a168ccd98-kube-api-access-5dx7q\") pod \"control-plane-machine-set-operator-78cbb6b69f-676rf\" (UID: \"aeb38d85-05d0-4a84-b3a7-4a7a168ccd98\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-676rf" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 
13:06:31.684602 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dwqc\" (UniqueName: \"kubernetes.io/projected/3ab8ca08-87a3-4920-aa33-6bbc290b1c15-kube-api-access-4dwqc\") pod \"service-ca-9c57cc56f-22s99\" (UID: \"3ab8ca08-87a3-4920-aa33-6bbc290b1c15\") " pod="openshift-service-ca/service-ca-9c57cc56f-22s99" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.684637 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndrg9\" (UniqueName: \"kubernetes.io/projected/06275739-8ec6-405c-a9d4-0fc545a2277a-kube-api-access-ndrg9\") pod \"migrator-59844c95c7-5gljx\" (UID: \"06275739-8ec6-405c-a9d4-0fc545a2277a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5gljx" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.684684 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/14360874-1ac7-4262-b7ed-3ccc4d909191-tmpfs\") pod \"packageserver-d55dfcdfc-pjgq4\" (UID: \"14360874-1ac7-4262-b7ed-3ccc4d909191\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.685516 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59qgx\" (UniqueName: \"kubernetes.io/projected/c395a35a-0376-4626-a75d-c1d2631e3de1-kube-api-access-59qgx\") pod \"machine-config-controller-84d6567774-wm9qr\" (UID: \"c395a35a-0376-4626-a75d-c1d2631e3de1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.685566 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/aa90eeb0-bd02-434e-a457-47336b084be7-registration-dir\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.685585 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d9ad469-8c6d-45df-8a7d-84250d766f58-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lw2tt\" (UID: \"7d9ad469-8c6d-45df-8a7d-84250d766f58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.686404 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/065a3503-e7a8-4b7d-9cb7-366489ac0247-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9xjl\" (UID: \"065a3503-e7a8-4b7d-9cb7-366489ac0247\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.686544 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.686628 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/aa90eeb0-bd02-434e-a457-47336b084be7-plugins-dir\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" 
Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.686675 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/588998fa-36f4-49d4-a69b-d60a3952787b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nv8jk\" (UID: \"588998fa-36f4-49d4-a69b-d60a3952787b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.686695 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v79b5\" (UniqueName: \"kubernetes.io/projected/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-kube-api-access-v79b5\") pod \"marketplace-operator-79b997595-92m7s\" (UID: \"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.687316 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.690134 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.692073 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-registry-certificates\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.693356 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065a3503-e7a8-4b7d-9cb7-366489ac0247-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9xjl\" (UID: \"065a3503-e7a8-4b7d-9cb7-366489ac0247\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.693400 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/588998fa-36f4-49d4-a69b-d60a3952787b-srv-cert\") pod \"olm-operator-6b444d44fb-nv8jk\" (UID: \"588998fa-36f4-49d4-a69b-d60a3952787b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.693922 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.694100 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bsx6\" (UniqueName: \"kubernetes.io/projected/7d9ad469-8c6d-45df-8a7d-84250d766f58-kube-api-access-2bsx6\") pod \"kube-storage-version-migrator-operator-b67b599dd-lw2tt\" (UID: \"7d9ad469-8c6d-45df-8a7d-84250d766f58\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.695742 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.695792 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3ab8ca08-87a3-4920-aa33-6bbc290b1c15-signing-cabundle\") pod \"service-ca-9c57cc56f-22s99\" (UID: \"3ab8ca08-87a3-4920-aa33-6bbc290b1c15\") " pod="openshift-service-ca/service-ca-9c57cc56f-22s99" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.695852 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.695881 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/aa90eeb0-bd02-434e-a457-47336b084be7-mountpoint-dir\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.695907 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/2ae587f2-140e-4780-844e-1eb3430f7ee6-metrics-tls\") pod \"dns-default-cc9qv\" (UID: \"2ae587f2-140e-4780-844e-1eb3430f7ee6\") " pod="openshift-dns/dns-default-cc9qv" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.695986 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.696004 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c395a35a-0376-4626-a75d-c1d2631e3de1-proxy-tls\") pod \"machine-config-controller-84d6567774-wm9qr\" (UID: \"c395a35a-0376-4626-a75d-c1d2631e3de1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.696031 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c395a35a-0376-4626-a75d-c1d2631e3de1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wm9qr\" (UID: \"c395a35a-0376-4626-a75d-c1d2631e3de1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.696191 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065a3503-e7a8-4b7d-9cb7-366489ac0247-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9xjl\" (UID: \"065a3503-e7a8-4b7d-9cb7-366489ac0247\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl" Dec 
11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.696216 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b85feb3-50df-4ad9-ac89-de94e7842c7e-serving-cert\") pod \"service-ca-operator-777779d784-2cnbc\" (UID: \"2b85feb3-50df-4ad9-ac89-de94e7842c7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.696238 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5bcb23e-d100-4e41-bcc4-e8773a821c91-trusted-ca\") pod \"ingress-operator-5b745b69d9-6df8v\" (UID: \"d5bcb23e-d100-4e41-bcc4-e8773a821c91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.696259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-bound-sa-token\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.696276 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsdqx\" (UniqueName: \"kubernetes.io/projected/ccaafcb8-0877-4754-b7f7-43d3c35c6283-kube-api-access-gsdqx\") pod \"ingress-canary-mdzwq\" (UID: \"ccaafcb8-0877-4754-b7f7-43d3c35c6283\") " pod="openshift-ingress-canary/ingress-canary-mdzwq" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.697168 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/502c4cda-4f15-490b-bce9-a51f62fb940e-node-bootstrap-token\") pod 
\"machine-config-server-5ht9s\" (UID: \"502c4cda-4f15-490b-bce9-a51f62fb940e\") " pod="openshift-machine-config-operator/machine-config-server-5ht9s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.697228 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.697224 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.697323 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.697382 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.697489 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d25d3be-3b53-4824-b357-5f251a16aa38-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25l4n\" (UID: \"4d25d3be-3b53-4824-b357-5f251a16aa38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.697533 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmt9s\" (UniqueName: \"kubernetes.io/projected/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-kube-api-access-lmt9s\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.697550 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e65a531b-17ae-4b24-b9f6-71c758a757b0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nnwb2\" (UID: \"e65a531b-17ae-4b24-b9f6-71c758a757b0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.697567 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-92m7s\" (UID: \"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.697589 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-trusted-ca\") pod 
\"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.698156 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: E1211 13:06:31.698903 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:32.198891074 +0000 UTC m=+149.771217511 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.700738 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.702611 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.702661 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.704074 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-trusted-ca\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.704434 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.704897 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5bcb23e-d100-4e41-bcc4-e8773a821c91-metrics-tls\") pod \"ingress-operator-5b745b69d9-6df8v\" (UID: \"d5bcb23e-d100-4e41-bcc4-e8773a821c91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" Dec 11 13:06:31 
crc kubenswrapper[4898]: I1211 13:06:31.704952 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-92m7s\" (UID: \"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.705004 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-registry-tls\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.705210 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aa90eeb0-bd02-434e-a457-47336b084be7-socket-dir\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.705229 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d5bcb23e-d100-4e41-bcc4-e8773a821c91-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6df8v\" (UID: \"d5bcb23e-d100-4e41-bcc4-e8773a821c91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.705273 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzmxs\" (UniqueName: \"kubernetes.io/projected/39bfd95c-8066-475a-ae86-f34ce2cae1e7-kube-api-access-fzmxs\") pod 
\"multus-admission-controller-857f4d67dd-vjbmx\" (UID: \"39bfd95c-8066-475a-ae86-f34ce2cae1e7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vjbmx" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.706587 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.707437 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/aa90eeb0-bd02-434e-a457-47336b084be7-csi-data-dir\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.707503 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqv9j\" (UniqueName: \"kubernetes.io/projected/aa90eeb0-bd02-434e-a457-47336b084be7-kube-api-access-xqv9j\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.707537 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aeb38d85-05d0-4a84-b3a7-4a7a168ccd98-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-676rf\" (UID: \"aeb38d85-05d0-4a84-b3a7-4a7a168ccd98\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-676rf" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 
13:06:31.707570 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.708060 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-audit-policies\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.708095 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5b90074-739f-4f6d-a41c-29612abca57e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zl9tf\" (UID: \"d5b90074-739f-4f6d-a41c-29612abca57e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.708135 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d9ad469-8c6d-45df-8a7d-84250d766f58-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lw2tt\" (UID: \"7d9ad469-8c6d-45df-8a7d-84250d766f58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.708797 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-audit-policies\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.708840 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrxc2\" (UniqueName: \"kubernetes.io/projected/d5b90074-739f-4f6d-a41c-29612abca57e-kube-api-access-qrxc2\") pod \"package-server-manager-789f6589d5-zl9tf\" (UID: \"d5b90074-739f-4f6d-a41c-29612abca57e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.708909 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6j5q\" (UniqueName: \"kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-kube-api-access-b6j5q\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.709001 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvrfg\" (UniqueName: \"kubernetes.io/projected/502c4cda-4f15-490b-bce9-a51f62fb940e-kube-api-access-fvrfg\") pod \"machine-config-server-5ht9s\" (UID: \"502c4cda-4f15-490b-bce9-a51f62fb940e\") " pod="openshift-machine-config-operator/machine-config-server-5ht9s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.709020 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e650293a-d60c-4e05-88dd-ea1fa46b3492-config-volume\") pod \"collect-profiles-29424300-txml6\" (UID: \"e650293a-d60c-4e05-88dd-ea1fa46b3492\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.709038 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjldm\" (UniqueName: \"kubernetes.io/projected/c000ebbb-ebbf-4861-b148-6a21649befb4-kube-api-access-pjldm\") pod \"catalog-operator-68c6474976-ntzhr\" (UID: \"c000ebbb-ebbf-4861-b148-6a21649befb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.709054 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b85feb3-50df-4ad9-ac89-de94e7842c7e-config\") pod \"service-ca-operator-777779d784-2cnbc\" (UID: \"2b85feb3-50df-4ad9-ac89-de94e7842c7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.709069 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e650293a-d60c-4e05-88dd-ea1fa46b3492-secret-volume\") pod \"collect-profiles-29424300-txml6\" (UID: \"e650293a-d60c-4e05-88dd-ea1fa46b3492\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.710055 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.710761 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-42442" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.727955 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.731863 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.732716 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e65a531b-17ae-4b24-b9f6-71c758a757b0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nnwb2\" (UID: \"e65a531b-17ae-4b24-b9f6-71c758a757b0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.734502 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcnzx\" (UniqueName: \"kubernetes.io/projected/e65a531b-17ae-4b24-b9f6-71c758a757b0-kube-api-access-rcnzx\") pod \"cluster-samples-operator-665b6dd947-nnwb2\" (UID: \"e65a531b-17ae-4b24-b9f6-71c758a757b0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.735983 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-registry-tls\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.737886 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.754278 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7cbpd"] Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.754874 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.754931 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nqx5\" (UniqueName: \"kubernetes.io/projected/065a3503-e7a8-4b7d-9cb7-366489ac0247-kube-api-access-4nqx5\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9xjl\" (UID: \"065a3503-e7a8-4b7d-9cb7-366489ac0247\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.788666 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmt9s\" (UniqueName: \"kubernetes.io/projected/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-kube-api-access-lmt9s\") pod \"oauth-openshift-558db77b4-lbjzz\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 
13:06:31.811492 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.811715 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5bcb23e-d100-4e41-bcc4-e8773a821c91-metrics-tls\") pod \"ingress-operator-5b745b69d9-6df8v\" (UID: \"d5bcb23e-d100-4e41-bcc4-e8773a821c91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.811741 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-92m7s\" (UID: \"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.811776 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aa90eeb0-bd02-434e-a457-47336b084be7-socket-dir\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.811795 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d5bcb23e-d100-4e41-bcc4-e8773a821c91-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6df8v\" (UID: \"d5bcb23e-d100-4e41-bcc4-e8773a821c91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" Dec 11 
13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.811825 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/aa90eeb0-bd02-434e-a457-47336b084be7-csi-data-dir\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.811846 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqv9j\" (UniqueName: \"kubernetes.io/projected/aa90eeb0-bd02-434e-a457-47336b084be7-kube-api-access-xqv9j\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.811867 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aeb38d85-05d0-4a84-b3a7-4a7a168ccd98-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-676rf\" (UID: \"aeb38d85-05d0-4a84-b3a7-4a7a168ccd98\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-676rf" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.811890 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzmxs\" (UniqueName: \"kubernetes.io/projected/39bfd95c-8066-475a-ae86-f34ce2cae1e7-kube-api-access-fzmxs\") pod \"multus-admission-controller-857f4d67dd-vjbmx\" (UID: \"39bfd95c-8066-475a-ae86-f34ce2cae1e7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vjbmx" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.811915 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d5b90074-739f-4f6d-a41c-29612abca57e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zl9tf\" (UID: \"d5b90074-739f-4f6d-a41c-29612abca57e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.811937 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d9ad469-8c6d-45df-8a7d-84250d766f58-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lw2tt\" (UID: \"7d9ad469-8c6d-45df-8a7d-84250d766f58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.811988 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrxc2\" (UniqueName: \"kubernetes.io/projected/d5b90074-739f-4f6d-a41c-29612abca57e-kube-api-access-qrxc2\") pod \"package-server-manager-789f6589d5-zl9tf\" (UID: \"d5b90074-739f-4f6d-a41c-29612abca57e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812010 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvrfg\" (UniqueName: \"kubernetes.io/projected/502c4cda-4f15-490b-bce9-a51f62fb940e-kube-api-access-fvrfg\") pod \"machine-config-server-5ht9s\" (UID: \"502c4cda-4f15-490b-bce9-a51f62fb940e\") " pod="openshift-machine-config-operator/machine-config-server-5ht9s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812030 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e650293a-d60c-4e05-88dd-ea1fa46b3492-config-volume\") pod \"collect-profiles-29424300-txml6\" (UID: \"e650293a-d60c-4e05-88dd-ea1fa46b3492\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812050 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b85feb3-50df-4ad9-ac89-de94e7842c7e-config\") pod \"service-ca-operator-777779d784-2cnbc\" (UID: \"2b85feb3-50df-4ad9-ac89-de94e7842c7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812068 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e650293a-d60c-4e05-88dd-ea1fa46b3492-secret-volume\") pod \"collect-profiles-29424300-txml6\" (UID: \"e650293a-d60c-4e05-88dd-ea1fa46b3492\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812089 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjldm\" (UniqueName: \"kubernetes.io/projected/c000ebbb-ebbf-4861-b148-6a21649befb4-kube-api-access-pjldm\") pod \"catalog-operator-68c6474976-ntzhr\" (UID: \"c000ebbb-ebbf-4861-b148-6a21649befb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812110 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39bfd95c-8066-475a-ae86-f34ce2cae1e7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vjbmx\" (UID: \"39bfd95c-8066-475a-ae86-f34ce2cae1e7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vjbmx" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812129 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccaafcb8-0877-4754-b7f7-43d3c35c6283-cert\") 
pod \"ingress-canary-mdzwq\" (UID: \"ccaafcb8-0877-4754-b7f7-43d3c35c6283\") " pod="openshift-ingress-canary/ingress-canary-mdzwq" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812151 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntjcn\" (UniqueName: \"kubernetes.io/projected/14360874-1ac7-4262-b7ed-3ccc4d909191-kube-api-access-ntjcn\") pod \"packageserver-d55dfcdfc-pjgq4\" (UID: \"14360874-1ac7-4262-b7ed-3ccc4d909191\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812171 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3ab8ca08-87a3-4920-aa33-6bbc290b1c15-signing-key\") pod \"service-ca-9c57cc56f-22s99\" (UID: \"3ab8ca08-87a3-4920-aa33-6bbc290b1c15\") " pod="openshift-service-ca/service-ca-9c57cc56f-22s99" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812195 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t52vq\" (UniqueName: \"kubernetes.io/projected/588998fa-36f4-49d4-a69b-d60a3952787b-kube-api-access-t52vq\") pod \"olm-operator-6b444d44fb-nv8jk\" (UID: \"588998fa-36f4-49d4-a69b-d60a3952787b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812215 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14360874-1ac7-4262-b7ed-3ccc4d909191-webhook-cert\") pod \"packageserver-d55dfcdfc-pjgq4\" (UID: \"14360874-1ac7-4262-b7ed-3ccc4d909191\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812254 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/c000ebbb-ebbf-4861-b148-6a21649befb4-srv-cert\") pod \"catalog-operator-68c6474976-ntzhr\" (UID: \"c000ebbb-ebbf-4861-b148-6a21649befb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812277 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jcpb\" (UniqueName: \"kubernetes.io/projected/2ae587f2-140e-4780-844e-1eb3430f7ee6-kube-api-access-7jcpb\") pod \"dns-default-cc9qv\" (UID: \"2ae587f2-140e-4780-844e-1eb3430f7ee6\") " pod="openshift-dns/dns-default-cc9qv" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812301 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnkql\" (UniqueName: \"kubernetes.io/projected/d5bcb23e-d100-4e41-bcc4-e8773a821c91-kube-api-access-fnkql\") pod \"ingress-operator-5b745b69d9-6df8v\" (UID: \"d5bcb23e-d100-4e41-bcc4-e8773a821c91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812325 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25hbq\" (UniqueName: \"kubernetes.io/projected/2b85feb3-50df-4ad9-ac89-de94e7842c7e-kube-api-access-25hbq\") pod \"service-ca-operator-777779d784-2cnbc\" (UID: \"2b85feb3-50df-4ad9-ac89-de94e7842c7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812345 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pfkw\" (UniqueName: \"kubernetes.io/projected/e650293a-d60c-4e05-88dd-ea1fa46b3492-kube-api-access-9pfkw\") pod \"collect-profiles-29424300-txml6\" (UID: \"e650293a-d60c-4e05-88dd-ea1fa46b3492\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812367 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/502c4cda-4f15-490b-bce9-a51f62fb940e-certs\") pod \"machine-config-server-5ht9s\" (UID: \"502c4cda-4f15-490b-bce9-a51f62fb940e\") " pod="openshift-machine-config-operator/machine-config-server-5ht9s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812391 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d25d3be-3b53-4824-b357-5f251a16aa38-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25l4n\" (UID: \"4d25d3be-3b53-4824-b357-5f251a16aa38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812409 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ae587f2-140e-4780-844e-1eb3430f7ee6-config-volume\") pod \"dns-default-cc9qv\" (UID: \"2ae587f2-140e-4780-844e-1eb3430f7ee6\") " pod="openshift-dns/dns-default-cc9qv" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812446 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c000ebbb-ebbf-4861-b148-6a21649befb4-profile-collector-cert\") pod \"catalog-operator-68c6474976-ntzhr\" (UID: \"c000ebbb-ebbf-4861-b148-6a21649befb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812498 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14360874-1ac7-4262-b7ed-3ccc4d909191-apiservice-cert\") pod \"packageserver-d55dfcdfc-pjgq4\" (UID: \"14360874-1ac7-4262-b7ed-3ccc4d909191\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 
13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812531 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d25d3be-3b53-4824-b357-5f251a16aa38-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25l4n\" (UID: \"4d25d3be-3b53-4824-b357-5f251a16aa38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812551 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dx7q\" (UniqueName: \"kubernetes.io/projected/aeb38d85-05d0-4a84-b3a7-4a7a168ccd98-kube-api-access-5dx7q\") pod \"control-plane-machine-set-operator-78cbb6b69f-676rf\" (UID: \"aeb38d85-05d0-4a84-b3a7-4a7a168ccd98\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-676rf" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812572 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndrg9\" (UniqueName: \"kubernetes.io/projected/06275739-8ec6-405c-a9d4-0fc545a2277a-kube-api-access-ndrg9\") pod \"migrator-59844c95c7-5gljx\" (UID: \"06275739-8ec6-405c-a9d4-0fc545a2277a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5gljx" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812594 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dwqc\" (UniqueName: \"kubernetes.io/projected/3ab8ca08-87a3-4920-aa33-6bbc290b1c15-kube-api-access-4dwqc\") pod \"service-ca-9c57cc56f-22s99\" (UID: \"3ab8ca08-87a3-4920-aa33-6bbc290b1c15\") " pod="openshift-service-ca/service-ca-9c57cc56f-22s99" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812636 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59qgx\" (UniqueName: 
\"kubernetes.io/projected/c395a35a-0376-4626-a75d-c1d2631e3de1-kube-api-access-59qgx\") pod \"machine-config-controller-84d6567774-wm9qr\" (UID: \"c395a35a-0376-4626-a75d-c1d2631e3de1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812654 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/14360874-1ac7-4262-b7ed-3ccc4d909191-tmpfs\") pod \"packageserver-d55dfcdfc-pjgq4\" (UID: \"14360874-1ac7-4262-b7ed-3ccc4d909191\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812673 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aa90eeb0-bd02-434e-a457-47336b084be7-registration-dir\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812693 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d9ad469-8c6d-45df-8a7d-84250d766f58-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lw2tt\" (UID: \"7d9ad469-8c6d-45df-8a7d-84250d766f58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812717 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/588998fa-36f4-49d4-a69b-d60a3952787b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nv8jk\" (UID: \"588998fa-36f4-49d4-a69b-d60a3952787b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" Dec 11 13:06:31 crc 
kubenswrapper[4898]: I1211 13:06:31.812737 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v79b5\" (UniqueName: \"kubernetes.io/projected/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-kube-api-access-v79b5\") pod \"marketplace-operator-79b997595-92m7s\" (UID: \"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812779 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/aa90eeb0-bd02-434e-a457-47336b084be7-plugins-dir\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812821 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/588998fa-36f4-49d4-a69b-d60a3952787b-srv-cert\") pod \"olm-operator-6b444d44fb-nv8jk\" (UID: \"588998fa-36f4-49d4-a69b-d60a3952787b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812857 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bsx6\" (UniqueName: \"kubernetes.io/projected/7d9ad469-8c6d-45df-8a7d-84250d766f58-kube-api-access-2bsx6\") pod \"kube-storage-version-migrator-operator-b67b599dd-lw2tt\" (UID: \"7d9ad469-8c6d-45df-8a7d-84250d766f58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812879 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3ab8ca08-87a3-4920-aa33-6bbc290b1c15-signing-cabundle\") pod \"service-ca-9c57cc56f-22s99\" (UID: 
\"3ab8ca08-87a3-4920-aa33-6bbc290b1c15\") " pod="openshift-service-ca/service-ca-9c57cc56f-22s99" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812900 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/aa90eeb0-bd02-434e-a457-47336b084be7-mountpoint-dir\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812919 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ae587f2-140e-4780-844e-1eb3430f7ee6-metrics-tls\") pod \"dns-default-cc9qv\" (UID: \"2ae587f2-140e-4780-844e-1eb3430f7ee6\") " pod="openshift-dns/dns-default-cc9qv" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812951 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c395a35a-0376-4626-a75d-c1d2631e3de1-proxy-tls\") pod \"machine-config-controller-84d6567774-wm9qr\" (UID: \"c395a35a-0376-4626-a75d-c1d2631e3de1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812973 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c395a35a-0376-4626-a75d-c1d2631e3de1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wm9qr\" (UID: \"c395a35a-0376-4626-a75d-c1d2631e3de1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.812992 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b85feb3-50df-4ad9-ac89-de94e7842c7e-serving-cert\") pod 
\"service-ca-operator-777779d784-2cnbc\" (UID: \"2b85feb3-50df-4ad9-ac89-de94e7842c7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.813017 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5bcb23e-d100-4e41-bcc4-e8773a821c91-trusted-ca\") pod \"ingress-operator-5b745b69d9-6df8v\" (UID: \"d5bcb23e-d100-4e41-bcc4-e8773a821c91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.813045 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsdqx\" (UniqueName: \"kubernetes.io/projected/ccaafcb8-0877-4754-b7f7-43d3c35c6283-kube-api-access-gsdqx\") pod \"ingress-canary-mdzwq\" (UID: \"ccaafcb8-0877-4754-b7f7-43d3c35c6283\") " pod="openshift-ingress-canary/ingress-canary-mdzwq" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.813067 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/502c4cda-4f15-490b-bce9-a51f62fb940e-node-bootstrap-token\") pod \"machine-config-server-5ht9s\" (UID: \"502c4cda-4f15-490b-bce9-a51f62fb940e\") " pod="openshift-machine-config-operator/machine-config-server-5ht9s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.813128 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d25d3be-3b53-4824-b357-5f251a16aa38-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25l4n\" (UID: \"4d25d3be-3b53-4824-b357-5f251a16aa38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.813164 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-92m7s\" (UID: \"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" Dec 11 13:06:31 crc kubenswrapper[4898]: E1211 13:06:31.813868 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:32.313850558 +0000 UTC m=+149.886176995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.816944 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c395a35a-0376-4626-a75d-c1d2631e3de1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wm9qr\" (UID: \"c395a35a-0376-4626-a75d-c1d2631e3de1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.817079 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/aa90eeb0-bd02-434e-a457-47336b084be7-mountpoint-dir\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc 
kubenswrapper[4898]: I1211 13:06:31.817255 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3ab8ca08-87a3-4920-aa33-6bbc290b1c15-signing-cabundle\") pod \"service-ca-9c57cc56f-22s99\" (UID: \"3ab8ca08-87a3-4920-aa33-6bbc290b1c15\") " pod="openshift-service-ca/service-ca-9c57cc56f-22s99" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.821552 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/588998fa-36f4-49d4-a69b-d60a3952787b-srv-cert\") pod \"olm-operator-6b444d44fb-nv8jk\" (UID: \"588998fa-36f4-49d4-a69b-d60a3952787b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.821576 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/14360874-1ac7-4262-b7ed-3ccc4d909191-tmpfs\") pod \"packageserver-d55dfcdfc-pjgq4\" (UID: \"14360874-1ac7-4262-b7ed-3ccc4d909191\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.823955 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aa90eeb0-bd02-434e-a457-47336b084be7-registration-dir\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.824697 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e650293a-d60c-4e05-88dd-ea1fa46b3492-secret-volume\") pod \"collect-profiles-29424300-txml6\" (UID: \"e650293a-d60c-4e05-88dd-ea1fa46b3492\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" Dec 11 13:06:31 crc kubenswrapper[4898]: 
I1211 13:06:31.824701 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d9ad469-8c6d-45df-8a7d-84250d766f58-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lw2tt\" (UID: \"7d9ad469-8c6d-45df-8a7d-84250d766f58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.824930 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/aa90eeb0-bd02-434e-a457-47336b084be7-csi-data-dir\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.824965 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aa90eeb0-bd02-434e-a457-47336b084be7-socket-dir\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.826301 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-92m7s\" (UID: \"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.829115 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/588998fa-36f4-49d4-a69b-d60a3952787b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nv8jk\" (UID: \"588998fa-36f4-49d4-a69b-d60a3952787b\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.829209 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/aa90eeb0-bd02-434e-a457-47336b084be7-plugins-dir\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.832332 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14360874-1ac7-4262-b7ed-3ccc4d909191-apiservice-cert\") pod \"packageserver-d55dfcdfc-pjgq4\" (UID: \"14360874-1ac7-4262-b7ed-3ccc4d909191\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.834362 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aeb38d85-05d0-4a84-b3a7-4a7a168ccd98-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-676rf\" (UID: \"aeb38d85-05d0-4a84-b3a7-4a7a168ccd98\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-676rf" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.835567 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5bcb23e-d100-4e41-bcc4-e8773a821c91-trusted-ca\") pod \"ingress-operator-5b745b69d9-6df8v\" (UID: \"d5bcb23e-d100-4e41-bcc4-e8773a821c91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.836439 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b85feb3-50df-4ad9-ac89-de94e7842c7e-serving-cert\") pod 
\"service-ca-operator-777779d784-2cnbc\" (UID: \"2b85feb3-50df-4ad9-ac89-de94e7842c7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.837019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c395a35a-0376-4626-a75d-c1d2631e3de1-proxy-tls\") pod \"machine-config-controller-84d6567774-wm9qr\" (UID: \"c395a35a-0376-4626-a75d-c1d2631e3de1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.837347 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-bound-sa-token\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.838070 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e650293a-d60c-4e05-88dd-ea1fa46b3492-config-volume\") pod \"collect-profiles-29424300-txml6\" (UID: \"e650293a-d60c-4e05-88dd-ea1fa46b3492\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.838628 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d25d3be-3b53-4824-b357-5f251a16aa38-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25l4n\" (UID: \"4d25d3be-3b53-4824-b357-5f251a16aa38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.838732 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6j5q\" 
(UniqueName: \"kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-kube-api-access-b6j5q\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.838900 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b85feb3-50df-4ad9-ac89-de94e7842c7e-config\") pod \"service-ca-operator-777779d784-2cnbc\" (UID: \"2b85feb3-50df-4ad9-ac89-de94e7842c7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.844208 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ae587f2-140e-4780-844e-1eb3430f7ee6-metrics-tls\") pod \"dns-default-cc9qv\" (UID: \"2ae587f2-140e-4780-844e-1eb3430f7ee6\") " pod="openshift-dns/dns-default-cc9qv" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.845362 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5b90074-739f-4f6d-a41c-29612abca57e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zl9tf\" (UID: \"d5b90074-739f-4f6d-a41c-29612abca57e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.845564 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ae587f2-140e-4780-844e-1eb3430f7ee6-config-volume\") pod \"dns-default-cc9qv\" (UID: \"2ae587f2-140e-4780-844e-1eb3430f7ee6\") " pod="openshift-dns/dns-default-cc9qv" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.845995 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" 
(UniqueName: \"kubernetes.io/secret/39bfd95c-8066-475a-ae86-f34ce2cae1e7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vjbmx\" (UID: \"39bfd95c-8066-475a-ae86-f34ce2cae1e7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vjbmx" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.852046 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3ab8ca08-87a3-4920-aa33-6bbc290b1c15-signing-key\") pod \"service-ca-9c57cc56f-22s99\" (UID: \"3ab8ca08-87a3-4920-aa33-6bbc290b1c15\") " pod="openshift-service-ca/service-ca-9c57cc56f-22s99" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.852730 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14360874-1ac7-4262-b7ed-3ccc4d909191-webhook-cert\") pod \"packageserver-d55dfcdfc-pjgq4\" (UID: \"14360874-1ac7-4262-b7ed-3ccc4d909191\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.854349 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/502c4cda-4f15-490b-bce9-a51f62fb940e-node-bootstrap-token\") pod \"machine-config-server-5ht9s\" (UID: \"502c4cda-4f15-490b-bce9-a51f62fb940e\") " pod="openshift-machine-config-operator/machine-config-server-5ht9s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.854985 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d9ad469-8c6d-45df-8a7d-84250d766f58-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lw2tt\" (UID: \"7d9ad469-8c6d-45df-8a7d-84250d766f58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.858190 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-92m7s\" (UID: \"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.866151 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d25d3be-3b53-4824-b357-5f251a16aa38-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25l4n\" (UID: \"4d25d3be-3b53-4824-b357-5f251a16aa38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.866218 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccaafcb8-0877-4754-b7f7-43d3c35c6283-cert\") pod \"ingress-canary-mdzwq\" (UID: \"ccaafcb8-0877-4754-b7f7-43d3c35c6283\") " pod="openshift-ingress-canary/ingress-canary-mdzwq" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.866654 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/502c4cda-4f15-490b-bce9-a51f62fb940e-certs\") pod \"machine-config-server-5ht9s\" (UID: \"502c4cda-4f15-490b-bce9-a51f62fb940e\") " pod="openshift-machine-config-operator/machine-config-server-5ht9s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.866748 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c000ebbb-ebbf-4861-b148-6a21649befb4-profile-collector-cert\") pod \"catalog-operator-68c6474976-ntzhr\" (UID: \"c000ebbb-ebbf-4861-b148-6a21649befb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" Dec 11 13:06:31 crc 
kubenswrapper[4898]: I1211 13:06:31.867947 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c000ebbb-ebbf-4861-b148-6a21649befb4-srv-cert\") pod \"catalog-operator-68c6474976-ntzhr\" (UID: \"c000ebbb-ebbf-4861-b148-6a21649befb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.869905 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw"] Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.872348 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5bcb23e-d100-4e41-bcc4-e8773a821c91-metrics-tls\") pod \"ingress-operator-5b745b69d9-6df8v\" (UID: \"d5bcb23e-d100-4e41-bcc4-e8773a821c91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.875009 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bsx6\" (UniqueName: \"kubernetes.io/projected/7d9ad469-8c6d-45df-8a7d-84250d766f58-kube-api-access-2bsx6\") pod \"kube-storage-version-migrator-operator-b67b599dd-lw2tt\" (UID: \"7d9ad469-8c6d-45df-8a7d-84250d766f58\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.890801 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dwqc\" (UniqueName: \"kubernetes.io/projected/3ab8ca08-87a3-4920-aa33-6bbc290b1c15-kube-api-access-4dwqc\") pod \"service-ca-9c57cc56f-22s99\" (UID: \"3ab8ca08-87a3-4920-aa33-6bbc290b1c15\") " pod="openshift-service-ca/service-ca-9c57cc56f-22s99" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.910410 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5dx7q\" (UniqueName: \"kubernetes.io/projected/aeb38d85-05d0-4a84-b3a7-4a7a168ccd98-kube-api-access-5dx7q\") pod \"control-plane-machine-set-operator-78cbb6b69f-676rf\" (UID: \"aeb38d85-05d0-4a84-b3a7-4a7a168ccd98\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-676rf" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.917820 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:31 crc kubenswrapper[4898]: E1211 13:06:31.918207 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:32.418192155 +0000 UTC m=+149.990518592 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.957546 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndrg9\" (UniqueName: \"kubernetes.io/projected/06275739-8ec6-405c-a9d4-0fc545a2277a-kube-api-access-ndrg9\") pod \"migrator-59844c95c7-5gljx\" (UID: \"06275739-8ec6-405c-a9d4-0fc545a2277a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5gljx" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.964653 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.966143 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27"] Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.973163 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v79b5\" (UniqueName: \"kubernetes.io/projected/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-kube-api-access-v79b5\") pod \"marketplace-operator-79b997595-92m7s\" (UID: \"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7\") " pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.973306 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzmxs\" (UniqueName: \"kubernetes.io/projected/39bfd95c-8066-475a-ae86-f34ce2cae1e7-kube-api-access-fzmxs\") pod 
\"multus-admission-controller-857f4d67dd-vjbmx\" (UID: \"39bfd95c-8066-475a-ae86-f34ce2cae1e7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vjbmx" Dec 11 13:06:31 crc kubenswrapper[4898]: I1211 13:06:31.975162 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl"] Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:31.996708 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqv9j\" (UniqueName: \"kubernetes.io/projected/aa90eeb0-bd02-434e-a457-47336b084be7-kube-api-access-xqv9j\") pod \"csi-hostpathplugin-n8qh4\" (UID: \"aa90eeb0-bd02-434e-a457-47336b084be7\") " pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.019335 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:32 crc kubenswrapper[4898]: E1211 13:06:32.019712 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:32.519698222 +0000 UTC m=+150.092024659 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.021777 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d5bcb23e-d100-4e41-bcc4-e8773a821c91-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6df8v\" (UID: \"d5bcb23e-d100-4e41-bcc4-e8773a821c91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" Dec 11 13:06:32 crc kubenswrapper[4898]: W1211 13:06:32.043963 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod730444c8_a1f8_4b14_b415_4b55869e7da3.slice/crio-bbb29dda5bd5ae4baf0134f3fad9f89e443078c4b8332fb565e38ae73cfd0a4c WatchSource:0}: Error finding container bbb29dda5bd5ae4baf0134f3fad9f89e443078c4b8332fb565e38ae73cfd0a4c: Status 404 returned error can't find the container with id bbb29dda5bd5ae4baf0134f3fad9f89e443078c4b8332fb565e38ae73cfd0a4c Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.044071 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.051274 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsdqx\" (UniqueName: \"kubernetes.io/projected/ccaafcb8-0877-4754-b7f7-43d3c35c6283-kube-api-access-gsdqx\") pod \"ingress-canary-mdzwq\" (UID: \"ccaafcb8-0877-4754-b7f7-43d3c35c6283\") " pod="openshift-ingress-canary/ingress-canary-mdzwq" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.060763 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjldm\" (UniqueName: \"kubernetes.io/projected/c000ebbb-ebbf-4861-b148-6a21649befb4-kube-api-access-pjldm\") pod \"catalog-operator-68c6474976-ntzhr\" (UID: \"c000ebbb-ebbf-4861-b148-6a21649befb4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.090830 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5gljx" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.107506 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59qgx\" (UniqueName: \"kubernetes.io/projected/c395a35a-0376-4626-a75d-c1d2631e3de1-kube-api-access-59qgx\") pod \"machine-config-controller-84d6567774-wm9qr\" (UID: \"c395a35a-0376-4626-a75d-c1d2631e3de1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.107794 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-676rf" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.120733 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:32 crc kubenswrapper[4898]: E1211 13:06:32.121098 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:32.621083516 +0000 UTC m=+150.193409953 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.124868 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvrfg\" (UniqueName: \"kubernetes.io/projected/502c4cda-4f15-490b-bce9-a51f62fb940e-kube-api-access-fvrfg\") pod \"machine-config-server-5ht9s\" (UID: \"502c4cda-4f15-490b-bce9-a51f62fb940e\") " pod="openshift-machine-config-operator/machine-config-server-5ht9s" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.165260 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p"] Dec 11 
13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.170159 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.173085 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-22s99" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.173192 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.180479 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mdzwq" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.182049 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d25d3be-3b53-4824-b357-5f251a16aa38-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-25l4n\" (UID: \"4d25d3be-3b53-4824-b357-5f251a16aa38\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.188600 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5ht9s" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.189770 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnkql\" (UniqueName: \"kubernetes.io/projected/d5bcb23e-d100-4e41-bcc4-e8773a821c91-kube-api-access-fnkql\") pod \"ingress-operator-5b745b69d9-6df8v\" (UID: \"d5bcb23e-d100-4e41-bcc4-e8773a821c91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.195197 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vjbmx" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.200044 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25hbq\" (UniqueName: \"kubernetes.io/projected/2b85feb3-50df-4ad9-ac89-de94e7842c7e-kube-api-access-25hbq\") pod \"service-ca-operator-777779d784-2cnbc\" (UID: \"2b85feb3-50df-4ad9-ac89-de94e7842c7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.202945 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.211972 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntjcn\" (UniqueName: \"kubernetes.io/projected/14360874-1ac7-4262-b7ed-3ccc4d909191-kube-api-access-ntjcn\") pod \"packageserver-d55dfcdfc-pjgq4\" (UID: \"14360874-1ac7-4262-b7ed-3ccc4d909191\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.217531 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pfkw\" (UniqueName: \"kubernetes.io/projected/e650293a-d60c-4e05-88dd-ea1fa46b3492-kube-api-access-9pfkw\") pod \"collect-profiles-29424300-txml6\" (UID: \"e650293a-d60c-4e05-88dd-ea1fa46b3492\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.223006 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:32 crc kubenswrapper[4898]: E1211 13:06:32.223344 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:32.723327334 +0000 UTC m=+150.295653761 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.249857 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrxc2\" (UniqueName: \"kubernetes.io/projected/d5b90074-739f-4f6d-a41c-29612abca57e-kube-api-access-qrxc2\") pod \"package-server-manager-789f6589d5-zl9tf\" (UID: \"d5b90074-739f-4f6d-a41c-29612abca57e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.250163 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jcpb\" (UniqueName: \"kubernetes.io/projected/2ae587f2-140e-4780-844e-1eb3430f7ee6-kube-api-access-7jcpb\") pod \"dns-default-cc9qv\" (UID: \"2ae587f2-140e-4780-844e-1eb3430f7ee6\") " pod="openshift-dns/dns-default-cc9qv" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.252096 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.261230 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t52vq\" (UniqueName: \"kubernetes.io/projected/588998fa-36f4-49d4-a69b-d60a3952787b-kube-api-access-t52vq\") pod \"olm-operator-6b444d44fb-nv8jk\" (UID: \"588998fa-36f4-49d4-a69b-d60a3952787b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.324247 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:32 crc kubenswrapper[4898]: E1211 13:06:32.324798 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:32.82478381 +0000 UTC m=+150.397110237 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.371726 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.375622 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.388898 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.399427 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.420218 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.426861 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.431091 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:32 crc kubenswrapper[4898]: E1211 13:06:32.431227 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:32.931200915 +0000 UTC m=+150.503527352 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.431258 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:32 crc kubenswrapper[4898]: E1211 13:06:32.431692 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:32.931675038 +0000 UTC m=+150.504001475 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.434281 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.441201 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.452190 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4bmq4"] Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.474294 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cc9qv" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.521521 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mmcc8"] Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.524197 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-42442"] Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.524403 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" podStartSLOduration=131.52439498 podStartE2EDuration="2m11.52439498s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:32.518626959 +0000 UTC m=+150.090953406" watchObservedRunningTime="2025-12-11 13:06:32.52439498 +0000 UTC m=+150.096721417" Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.535925 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:32 crc kubenswrapper[4898]: E1211 13:06:32.536294 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:33.036279822 +0000 UTC m=+150.608606259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.637113 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:32 crc kubenswrapper[4898]: E1211 13:06:32.637395 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:33.137383739 +0000 UTC m=+150.709710176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.662703 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2"] Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.701339 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl" event={"ID":"48dab864-fed6-46a5-bae2-85cfa4edc939","Type":"ContainerStarted","Data":"a6eb564afe809867b31813d9f4a82d112707f9337d0dab78b86718a8ea42c43c"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.739385 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:32 crc kubenswrapper[4898]: E1211 13:06:32.739860 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:33.239838312 +0000 UTC m=+150.812164749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.740989 4898 generic.go:334] "Generic (PLEG): container finished" podID="1bd37230-d5b0-47d1-b4c6-df3c1ad3788c" containerID="90edb8f4357ecf4f8be75cd62e4f2324812e267557d1ace6212ef799374e5f91" exitCode=0 Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.741069 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" event={"ID":"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c","Type":"ContainerDied","Data":"90edb8f4357ecf4f8be75cd62e4f2324812e267557d1ace6212ef799374e5f91"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.753102 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-42442" event={"ID":"5ec5ed0a-c3c1-4205-869f-c126a1249aa6","Type":"ContainerStarted","Data":"5a0179f9b3acfff4bf79e651b1a4bc063b67a2a3b706f332ec398167db85faee"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.759211 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw" event={"ID":"2ca49685-192c-4594-8046-f104c0590b2f","Type":"ContainerStarted","Data":"81d332c29a896e8e1343cc08cd3867ecab695a198dda4c2d7700130d5e7a1499"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.760169 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4bmq4" 
event={"ID":"5442029d-e1aa-4496-b3c6-18f11b034179","Type":"ContainerStarted","Data":"26d44e6ad0f02e112eaf0bf39e3f26523ffe9488853056a190e72cb59e9fb2d5"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.764240 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7cbpd" event={"ID":"dd7edc7b-2d3a-4402-b7de-e70de317e52e","Type":"ContainerStarted","Data":"77cb8186e37ceeb95553e18fa80b275131038f66f4ef145697387a2e3a44122b"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.764271 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7cbpd" event={"ID":"dd7edc7b-2d3a-4402-b7de-e70de317e52e","Type":"ContainerStarted","Data":"e38e187f9d9ca3467e385951b420634e42884eacc7a246a7d2ffe3a369a2bb59"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.768991 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5ht9s" event={"ID":"502c4cda-4f15-490b-bce9-a51f62fb940e","Type":"ContainerStarted","Data":"f630ea2e4ac289d6bca0cc4c962215bf6c7dcc56a65dee25b4d563018cf89e8d"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.769869 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mmcc8" event={"ID":"b7239fb8-0fed-4730-9608-db8000ae13dd","Type":"ContainerStarted","Data":"80c7c10a182f6dc52ba8079bda3c6b6671c8e58862be9ffd9d212575fae5becd"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.771135 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jmljz" event={"ID":"7b9b7fbd-6a91-4761-8604-b82c417e12f8","Type":"ContainerStarted","Data":"32667290e7f2ee56d146c89e60be0fed72620437a54370d2c55a14d4bc5bb28f"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.773932 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" 
event={"ID":"bd1c8aad-a95c-4ab4-82f9-beb8702efab8","Type":"ContainerStarted","Data":"6063224dc5293deceee41176ffd8e48d412ccf160af891f2d0581fc8de85afbe"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.852900 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:32 crc kubenswrapper[4898]: E1211 13:06:32.858219 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:33.358197451 +0000 UTC m=+150.930523888 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.897688 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27" event={"ID":"730444c8-a1f8-4b14-b415-4b55869e7da3","Type":"ContainerStarted","Data":"bbb29dda5bd5ae4baf0134f3fad9f89e443078c4b8332fb565e38ae73cfd0a4c"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.897734 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" 
event={"ID":"1dd4c49a-1898-48ab-9ed0-4f455f392b57","Type":"ContainerStarted","Data":"3f5abb7b0bc606caadfa6327cecb0508cb4131860c37923ac7fcc694bd59d0d7"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.897746 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" event={"ID":"1dd4c49a-1898-48ab-9ed0-4f455f392b57","Type":"ContainerStarted","Data":"82114d8cb9abdaeab29a4e351ac10afab3f0dc0b63766868deab867832d8e0ab"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.897757 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" event={"ID":"af9f94b9-3e85-4e0c-bb97-ac84071e969f","Type":"ContainerStarted","Data":"68eccb1734073a4c76d84cd08a5109a4302b1f76b1ae43d8bd8390fc7b0bb820"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.897769 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" event={"ID":"f46b0f35-a460-4977-a99a-8e0ea7015416","Type":"ContainerStarted","Data":"aa34d0dbf2d8fa19ca840209192a0a3f2bd5ac55c079419b95dbd534625e4239"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.897788 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" event={"ID":"f46b0f35-a460-4977-a99a-8e0ea7015416","Type":"ContainerStarted","Data":"e5c29678a6a07f558f68789dbb527165d6d56e1ce9994f58bfa724f77bba09b3"} Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.953762 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:32 crc kubenswrapper[4898]: E1211 13:06:32.955331 4898 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:33.455306156 +0000 UTC m=+151.027632663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:32 crc kubenswrapper[4898]: I1211 13:06:32.964276 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2c4kd" podStartSLOduration=131.964257176 podStartE2EDuration="2m11.964257176s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:32.96011063 +0000 UTC m=+150.532437057" watchObservedRunningTime="2025-12-11 13:06:32.964257176 +0000 UTC m=+150.536583613" Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.055859 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:33 crc kubenswrapper[4898]: E1211 13:06:33.056225 4898 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:33.556211426 +0000 UTC m=+151.128537863 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.147874 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lbjzz"] Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.148119 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rkglj"] Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.163542 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:33 crc kubenswrapper[4898]: E1211 13:06:33.163656 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:33.663614938 +0000 UTC m=+151.235941375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.163959 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:33 crc kubenswrapper[4898]: E1211 13:06:33.164203 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:33.664192984 +0000 UTC m=+151.236519421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.265479 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:33 crc kubenswrapper[4898]: E1211 13:06:33.265793 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:33.765778904 +0000 UTC m=+151.338105341 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.289847 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5gljx"] Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.366473 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:33 crc kubenswrapper[4898]: E1211 13:06:33.366799 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:33.866787688 +0000 UTC m=+151.439114125 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.366864 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n8qh4"] Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.415530 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.448385 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c9zbz" podStartSLOduration=132.448367598 podStartE2EDuration="2m12.448367598s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:33.442846704 +0000 UTC m=+151.015173141" watchObservedRunningTime="2025-12-11 13:06:33.448367598 +0000 UTC m=+151.020694035" Dec 11 13:06:33 crc kubenswrapper[4898]: W1211 13:06:33.456657 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa90eeb0_bd02_434e_a457_47336b084be7.slice/crio-38fa43612feb1733f818ac1f7a7b4d4d67dcb6aa6beb22ae5875ea3313978042 WatchSource:0}: Error finding container 38fa43612feb1733f818ac1f7a7b4d4d67dcb6aa6beb22ae5875ea3313978042: Status 404 returned error can't find the container with id 38fa43612feb1733f818ac1f7a7b4d4d67dcb6aa6beb22ae5875ea3313978042 Dec 11 13:06:33 crc 
kubenswrapper[4898]: I1211 13:06:33.474335 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:33 crc kubenswrapper[4898]: E1211 13:06:33.474782 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:33.974735805 +0000 UTC m=+151.547062242 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.537116 4898 patch_prober.go:28] interesting pod/router-default-5444994796-jmljz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 13:06:33 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Dec 11 13:06:33 crc kubenswrapper[4898]: [+]process-running ok Dec 11 13:06:33 crc kubenswrapper[4898]: healthz check failed Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.538246 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jmljz" podUID="7b9b7fbd-6a91-4761-8604-b82c417e12f8" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.580449 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:33 crc kubenswrapper[4898]: E1211 13:06:33.580727 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:34.080716748 +0000 UTC m=+151.653043185 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.664714 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" podStartSLOduration=131.664700505 podStartE2EDuration="2m11.664700505s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:33.662422012 +0000 UTC m=+151.234748449" watchObservedRunningTime="2025-12-11 13:06:33.664700505 +0000 UTC m=+151.237026942" Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 
13:06:33.681049 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:33 crc kubenswrapper[4898]: E1211 13:06:33.681755 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:34.181738972 +0000 UTC m=+151.754065409 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.781173 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" podStartSLOduration=132.781159531 podStartE2EDuration="2m12.781159531s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:33.757027706 +0000 UTC m=+151.329354143" watchObservedRunningTime="2025-12-11 13:06:33.781159531 +0000 UTC m=+151.353485968" Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.785676 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:33 crc kubenswrapper[4898]: E1211 13:06:33.786169 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:34.2861545 +0000 UTC m=+151.858480927 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.848497 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" event={"ID":"bd1c8aad-a95c-4ab4-82f9-beb8702efab8","Type":"ContainerStarted","Data":"0d3179c5ed4440e4e9d1b9e3ed789a48fd9fe9eddaaec99c164c239714ebbf9d"} Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.851111 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" event={"ID":"aa90eeb0-bd02-434e-a457-47336b084be7","Type":"ContainerStarted","Data":"38fa43612feb1733f818ac1f7a7b4d4d67dcb6aa6beb22ae5875ea3313978042"} Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.860012 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27" 
event={"ID":"730444c8-a1f8-4b14-b415-4b55869e7da3","Type":"ContainerStarted","Data":"6c314ad8680a62fe10960685ce9901de9a6b51ceb887212b644d3103dab4f074"} Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.862271 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rkglj" event={"ID":"cf77b13b-75a2-4fc1-a068-8fd33773f827","Type":"ContainerStarted","Data":"f8fc4bab34d52facb68e8f93e50deb0caa6ff14f7798cbeaed07789f6aa0ce1a"} Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.873097 4898 generic.go:334] "Generic (PLEG): container finished" podID="f46b0f35-a460-4977-a99a-8e0ea7015416" containerID="aa34d0dbf2d8fa19ca840209192a0a3f2bd5ac55c079419b95dbd534625e4239" exitCode=0 Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.873195 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" event={"ID":"f46b0f35-a460-4977-a99a-8e0ea7015416","Type":"ContainerDied","Data":"aa34d0dbf2d8fa19ca840209192a0a3f2bd5ac55c079419b95dbd534625e4239"} Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.875024 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2" event={"ID":"e65a531b-17ae-4b24-b9f6-71c758a757b0","Type":"ContainerStarted","Data":"adc7f3b9b6e4c43269851afe81ead6142dcf0821e8f3fcb70a7ac27c4585d50e"} Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.875045 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2" event={"ID":"e65a531b-17ae-4b24-b9f6-71c758a757b0","Type":"ContainerStarted","Data":"7cdda5c551effc1f8544a6ba82b1df2a0c8fc90625b38c449115c696635891c3"} Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.879402 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5gljx" 
event={"ID":"06275739-8ec6-405c-a9d4-0fc545a2277a","Type":"ContainerStarted","Data":"bf23bca03af256f7b46c756bb97be9009c2027793ae8d17d07e97c4b31dedcfb"} Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.887494 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:33 crc kubenswrapper[4898]: E1211 13:06:33.887895 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:34.387875344 +0000 UTC m=+151.960201791 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.892143 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4bmq4" event={"ID":"5442029d-e1aa-4496-b3c6-18f11b034179","Type":"ContainerStarted","Data":"2f9ef52ec10bff318b365c21238a425bfdfdf77e900fa0dddc4ac867ef8e6015"} Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.892821 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-4bmq4" Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.898198 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" event={"ID":"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6","Type":"ContainerStarted","Data":"22938ddf3d1b682fb07ad119b578ae2f7cadfe39e5b9da228a918ebdb7b94f21"} Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.905052 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5ht9s" event={"ID":"502c4cda-4f15-490b-bce9-a51f62fb940e","Type":"ContainerStarted","Data":"37914c67da6810df8adb5fc10331b1353001975f6b4d9d9c29438122c8b81264"} Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.931850 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl" event={"ID":"48dab864-fed6-46a5-bae2-85cfa4edc939","Type":"ContainerStarted","Data":"fbe7483b9d30c2b75d25420451d8ffc1adbdb79018abaa102df2021a95eb13aa"} Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.974557 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw" event={"ID":"2ca49685-192c-4594-8046-f104c0590b2f","Type":"ContainerStarted","Data":"b823e07713f8597f01853007d48ca78c8c5a388a5c9d3b40446bcc199ce1b977"} Dec 11 13:06:33 crc kubenswrapper[4898]: I1211 13:06:33.989667 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:33 crc kubenswrapper[4898]: E1211 13:06:33.990171 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-11 13:06:34.490154403 +0000 UTC m=+152.062480840 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.015954 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkd27" podStartSLOduration=133.015934033 podStartE2EDuration="2m13.015934033s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:34.015140871 +0000 UTC m=+151.587467318" watchObservedRunningTime="2025-12-11 13:06:34.015934033 +0000 UTC m=+151.588260480" Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.091761 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.091880 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lqr65" podStartSLOduration=132.091859656 podStartE2EDuration="2m12.091859656s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-11 13:06:34.091373552 +0000 UTC m=+151.663699989" watchObservedRunningTime="2025-12-11 13:06:34.091859656 +0000 UTC m=+151.664186093" Dec 11 13:06:34 crc kubenswrapper[4898]: E1211 13:06:34.091933 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:34.591911657 +0000 UTC m=+152.164238094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.093170 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:34 crc kubenswrapper[4898]: E1211 13:06:34.095667 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:34.595650282 +0000 UTC m=+152.167976719 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.200903 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:34 crc kubenswrapper[4898]: E1211 13:06:34.201203 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:34.701188422 +0000 UTC m=+152.273514859 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.211504 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-4bmq4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.211562 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4bmq4" podUID="5442029d-e1aa-4496-b3c6-18f11b034179" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.247504 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" podStartSLOduration=133.247486576 podStartE2EDuration="2m13.247486576s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:34.247357842 +0000 UTC m=+151.819684279" watchObservedRunningTime="2025-12-11 13:06:34.247486576 +0000 UTC m=+151.819813023" Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.287710 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-jmljz" podStartSLOduration=132.28769456 
podStartE2EDuration="2m12.28769456s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:34.285446927 +0000 UTC m=+151.857773384" watchObservedRunningTime="2025-12-11 13:06:34.28769456 +0000 UTC m=+151.860021017" Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.304285 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:34 crc kubenswrapper[4898]: E1211 13:06:34.304730 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:34.804714876 +0000 UTC m=+152.377041323 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.378173 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-7cbpd" podStartSLOduration=133.378156719 podStartE2EDuration="2m13.378156719s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:34.377171771 +0000 UTC m=+151.949498218" watchObservedRunningTime="2025-12-11 13:06:34.378156719 +0000 UTC m=+151.950483166" Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.406155 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:34 crc kubenswrapper[4898]: E1211 13:06:34.406571 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:34.906555893 +0000 UTC m=+152.478882330 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.413502 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-5ht9s" podStartSLOduration=6.413486446 podStartE2EDuration="6.413486446s" podCreationTimestamp="2025-12-11 13:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:34.412486288 +0000 UTC m=+151.984812725" watchObservedRunningTime="2025-12-11 13:06:34.413486446 +0000 UTC m=+151.985812883" Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.465888 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-vc9zl" podStartSLOduration=132.46586724 podStartE2EDuration="2m12.46586724s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:34.4640633 +0000 UTC m=+152.036389737" watchObservedRunningTime="2025-12-11 13:06:34.46586724 +0000 UTC m=+152.038193677" Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.502539 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fswhw" podStartSLOduration=132.502520525 podStartE2EDuration="2m12.502520525s" podCreationTimestamp="2025-12-11 13:04:22 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:34.501590959 +0000 UTC m=+152.073917406" watchObservedRunningTime="2025-12-11 13:06:34.502520525 +0000 UTC m=+152.074846962" Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.511257 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:34 crc kubenswrapper[4898]: E1211 13:06:34.511575 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:35.011564328 +0000 UTC m=+152.583890765 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.536005 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-4bmq4" podStartSLOduration=133.535990531 podStartE2EDuration="2m13.535990531s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:34.534374865 +0000 UTC m=+152.106701302" watchObservedRunningTime="2025-12-11 13:06:34.535990531 +0000 UTC m=+152.108316968" Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.608119 4898 patch_prober.go:28] interesting pod/router-default-5444994796-jmljz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 13:06:34 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Dec 11 13:06:34 crc kubenswrapper[4898]: [+]process-running ok Dec 11 13:06:34 crc kubenswrapper[4898]: healthz check failed Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.608178 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jmljz" podUID="7b9b7fbd-6a91-4761-8604-b82c417e12f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.616854 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:34 crc kubenswrapper[4898]: E1211 13:06:34.617314 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:35.117294233 +0000 UTC m=+152.689620680 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.720115 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:34 crc kubenswrapper[4898]: E1211 13:06:34.720508 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:35.220497468 +0000 UTC m=+152.792823905 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.801805 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-676rf"] Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.802137 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-92m7s"] Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.816758 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl"] Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.824808 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:34 crc kubenswrapper[4898]: E1211 13:06:34.824962 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:35.324930227 +0000 UTC m=+152.897256674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.825087 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:34 crc kubenswrapper[4898]: E1211 13:06:34.825501 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:35.325488113 +0000 UTC m=+152.897814610 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:34 crc kubenswrapper[4898]: W1211 13:06:34.833018 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cfb0e1f_8b8f_4f19_9a91_87a9c2bd42f7.slice/crio-fe6294ae9cd2a4722a60cab6aaf4048fa5ccb63c306edef175d95f7ef23c33f1 WatchSource:0}: Error finding container fe6294ae9cd2a4722a60cab6aaf4048fa5ccb63c306edef175d95f7ef23c33f1: Status 404 returned error can't find the container with id fe6294ae9cd2a4722a60cab6aaf4048fa5ccb63c306edef175d95f7ef23c33f1 Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.850139 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt"] Dec 11 13:06:34 crc kubenswrapper[4898]: W1211 13:06:34.865724 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d9ad469_8c6d_45df_8a7d_84250d766f58.slice/crio-fd6d129e11f3df015cae16c612e668c50e42335c6686c092904d35bac536131b WatchSource:0}: Error finding container fd6d129e11f3df015cae16c612e668c50e42335c6686c092904d35bac536131b: Status 404 returned error can't find the container with id fd6d129e11f3df015cae16c612e668c50e42335c6686c092904d35bac536131b Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.931919 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:34 crc kubenswrapper[4898]: E1211 13:06:34.932036 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:35.432011071 +0000 UTC m=+153.004337508 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.932076 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:34 crc kubenswrapper[4898]: E1211 13:06:34.932603 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:35.432590307 +0000 UTC m=+153.004916744 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.984464 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:34 crc kubenswrapper[4898]: I1211 13:06:34.984691 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:34.999968 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.000021 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.000710 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.008511 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf"] Dec 11 13:06:35 crc 
kubenswrapper[4898]: I1211 13:06:35.033524 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:35 crc kubenswrapper[4898]: E1211 13:06:35.033970 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:35.533952849 +0000 UTC m=+153.106279286 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.034856 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" event={"ID":"bd1c8aad-a95c-4ab4-82f9-beb8702efab8","Type":"ContainerStarted","Data":"6a6276923797b501f81c3bc4f36016e91f878d372394a04956b07ddb7f19b99f"} Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.035826 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6"] Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.062393 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kdd7p" 
podStartSLOduration=133.062372334 podStartE2EDuration="2m13.062372334s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:35.055235804 +0000 UTC m=+152.627562251" watchObservedRunningTime="2025-12-11 13:06:35.062372334 +0000 UTC m=+152.634698771" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.063926 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" event={"ID":"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6","Type":"ContainerStarted","Data":"e5b152c52c061285811afdcf1c1031c3c20070d654a4cf616d6c3e808aec136e"} Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.065727 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.065746 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" event={"ID":"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7","Type":"ContainerStarted","Data":"fe6294ae9cd2a4722a60cab6aaf4048fa5ccb63c306edef175d95f7ef23c33f1"} Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.100433 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mmcc8" event={"ID":"b7239fb8-0fed-4730-9608-db8000ae13dd","Type":"ContainerStarted","Data":"3f1e65472fcbda4bf3378ecf3fd39b45e2988c43e78bc831a9a285b3ce9a9f0f"} Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.124821 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mmcc8" event={"ID":"b7239fb8-0fed-4730-9608-db8000ae13dd","Type":"ContainerStarted","Data":"bd64efc600bd2ccd7236476db36f25ed22a1be3f22e2cb1a1e523f488b467435"} Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.136505 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:35 crc kubenswrapper[4898]: E1211 13:06:35.136844 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:35.636829295 +0000 UTC m=+153.209155732 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.155767 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5gljx" event={"ID":"06275739-8ec6-405c-a9d4-0fc545a2277a","Type":"ContainerStarted","Data":"4f86e4d5844128b7f1fa6bc336235db279c991be83be7fa8ea573ee74c2e120e"} Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.155809 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5gljx" event={"ID":"06275739-8ec6-405c-a9d4-0fc545a2277a","Type":"ContainerStarted","Data":"691abe9398c681accf1eedd87a8b42840b79be8a58424cce260f9e75aa759e63"} Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.164989 4898 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" podStartSLOduration=134.164972992 podStartE2EDuration="2m14.164972992s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:35.163574003 +0000 UTC m=+152.735900450" watchObservedRunningTime="2025-12-11 13:06:35.164972992 +0000 UTC m=+152.737299429" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.181783 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk"] Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.181925 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2" event={"ID":"e65a531b-17ae-4b24-b9f6-71c758a757b0","Type":"ContainerStarted","Data":"ab8967e76a64ccd86dc31bb3e59cf89d8388c3a26fec55b076765aa64609c4b8"} Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.212360 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-22s99"] Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.212498 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc"] Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.222252 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" event={"ID":"f46b0f35-a460-4977-a99a-8e0ea7015416","Type":"ContainerStarted","Data":"f874537b3cf8ddf2e7ad2c93c35967f17fad71cd2c7f7863e024bfea8bd767a7"} Dec 11 13:06:35 crc kubenswrapper[4898]: W1211 13:06:35.223945 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod588998fa_36f4_49d4_a69b_d60a3952787b.slice/crio-9cfd6e1864e5ee1284a72d1d33f2c00bed9b4edccca00131e5f8c7bad6c49a20 WatchSource:0}: Error finding container 9cfd6e1864e5ee1284a72d1d33f2c00bed9b4edccca00131e5f8c7bad6c49a20: Status 404 returned error can't find the container with id 9cfd6e1864e5ee1284a72d1d33f2c00bed9b4edccca00131e5f8c7bad6c49a20 Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.226519 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr"] Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.234028 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vjbmx"] Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.236277 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mdzwq"] Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.237954 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:35 crc kubenswrapper[4898]: E1211 13:06:35.239360 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:35.739340231 +0000 UTC m=+153.311666678 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.246467 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n"] Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.249091 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v"] Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.250354 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5gljx" podStartSLOduration=133.250344138 podStartE2EDuration="2m13.250344138s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:35.212213342 +0000 UTC m=+152.784539779" watchObservedRunningTime="2025-12-11 13:06:35.250344138 +0000 UTC m=+152.822670575" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.262510 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr"] Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.263262 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4"] Dec 11 13:06:35 crc kubenswrapper[4898]: W1211 13:06:35.265881 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d25d3be_3b53_4824_b357_5f251a16aa38.slice/crio-236ac0b2584826857b09959e4633e44f2020ba07aab6357263c5c072582c989d WatchSource:0}: Error finding container 236ac0b2584826857b09959e4633e44f2020ba07aab6357263c5c072582c989d: Status 404 returned error can't find the container with id 236ac0b2584826857b09959e4633e44f2020ba07aab6357263c5c072582c989d Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.270530 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-676rf" event={"ID":"aeb38d85-05d0-4a84-b3a7-4a7a168ccd98","Type":"ContainerStarted","Data":"7f28fa2eb2d16f6caa6b54d3c045b9270bd71ea82b7b3e89ac768eebd1f91f95"} Dec 11 13:06:35 crc kubenswrapper[4898]: W1211 13:06:35.286983 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39bfd95c_8066_475a_ae86_f34ce2cae1e7.slice/crio-3322e8c1e047c523b48f1dd1d27e3fd1fc155cda69840d67df706614dc1c04d6 WatchSource:0}: Error finding container 3322e8c1e047c523b48f1dd1d27e3fd1fc155cda69840d67df706614dc1c04d6: Status 404 returned error can't find the container with id 3322e8c1e047c523b48f1dd1d27e3fd1fc155cda69840d67df706614dc1c04d6 Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.288782 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mmcc8" podStartSLOduration=134.288768922 podStartE2EDuration="2m14.288768922s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:35.288086433 +0000 UTC m=+152.860412870" watchObservedRunningTime="2025-12-11 13:06:35.288768922 +0000 UTC m=+152.861095359" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.318469 4898 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-dns/dns-default-cc9qv"] Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.318668 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rkglj" event={"ID":"cf77b13b-75a2-4fc1-a068-8fd33773f827","Type":"ContainerStarted","Data":"104ac80f708d1f7e764d88de5d39d606a00324ec93407c7be2f96b03d40ae7cf"} Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.319368 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-rkglj" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.327888 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt" event={"ID":"7d9ad469-8c6d-45df-8a7d-84250d766f58","Type":"ContainerStarted","Data":"fd6d129e11f3df015cae16c612e668c50e42335c6686c092904d35bac536131b"} Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.330055 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" podStartSLOduration=134.330045646 podStartE2EDuration="2m14.330045646s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:35.328568975 +0000 UTC m=+152.900895412" watchObservedRunningTime="2025-12-11 13:06:35.330045646 +0000 UTC m=+152.902372083" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.338993 4898 patch_prober.go:28] interesting pod/console-operator-58897d9998-rkglj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.339048 4898 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-rkglj" podUID="cf77b13b-75a2-4fc1-a068-8fd33773f827" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.339906 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:35 crc kubenswrapper[4898]: E1211 13:06:35.343848 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:35.843828921 +0000 UTC m=+153.416155358 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:35 crc kubenswrapper[4898]: W1211 13:06:35.344491 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14360874_1ac7_4262_b7ed_3ccc4d909191.slice/crio-8b0f1c7ca7793bdc3c333ef5849e3e6580ae1583b2779bfaf0dbe9ec494d5b39 WatchSource:0}: Error finding container 8b0f1c7ca7793bdc3c333ef5849e3e6580ae1583b2779bfaf0dbe9ec494d5b39: Status 404 returned error can't find the container with id 8b0f1c7ca7793bdc3c333ef5849e3e6580ae1583b2779bfaf0dbe9ec494d5b39 Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.346131 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl" event={"ID":"065a3503-e7a8-4b7d-9cb7-366489ac0247","Type":"ContainerStarted","Data":"5b6477758f469e00b2e535dadcacbf463b8ea6d64c0b150d0308efb1a964ce77"} Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.367113 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" event={"ID":"aa90eeb0-bd02-434e-a457-47336b084be7","Type":"ContainerStarted","Data":"d8d69e1e0a18ec75d1694ab4c26c7d8f8c8ab7091480cc4ff2d9ccf8ad1e6b04"} Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.380921 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-676rf" podStartSLOduration=133.380904308 podStartE2EDuration="2m13.380904308s" 
podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:35.380789705 +0000 UTC m=+152.953116142" watchObservedRunningTime="2025-12-11 13:06:35.380904308 +0000 UTC m=+152.953230745" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.381726 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2" podStartSLOduration=134.381720971 podStartE2EDuration="2m14.381720971s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:35.355621551 +0000 UTC m=+152.927947988" watchObservedRunningTime="2025-12-11 13:06:35.381720971 +0000 UTC m=+152.954047408" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.399616 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-rkglj" podStartSLOduration=134.39959832 podStartE2EDuration="2m14.39959832s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:35.399005234 +0000 UTC m=+152.971331671" watchObservedRunningTime="2025-12-11 13:06:35.39959832 +0000 UTC m=+152.971924757" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.420243 4898 patch_prober.go:28] interesting pod/router-default-5444994796-jmljz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 13:06:35 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Dec 11 13:06:35 crc kubenswrapper[4898]: [+]process-running ok Dec 11 13:06:35 crc 
kubenswrapper[4898]: healthz check failed Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.420299 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jmljz" podUID="7b9b7fbd-6a91-4761-8604-b82c417e12f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.431398 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" event={"ID":"1bd37230-d5b0-47d1-b4c6-df3c1ad3788c","Type":"ContainerStarted","Data":"63206ee6bbb11b91d0a56bcf2effb2e6bed671ba6008668cc4e4882d942b716f"} Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.438824 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-42442" event={"ID":"5ec5ed0a-c3c1-4205-869f-c126a1249aa6","Type":"ContainerStarted","Data":"2863ea48f093531637722dd31c79d014ec2f750bd8b22b526f35c823ca1121e6"} Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.442836 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.446913 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-4bmq4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 11 13:06:35 crc kubenswrapper[4898]: E1211 13:06:35.446956 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:35.946929003 +0000 UTC m=+153.519255450 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.446958 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4bmq4" podUID="5442029d-e1aa-4496-b3c6-18f11b034179" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.455379 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl" podStartSLOduration=134.455359469 podStartE2EDuration="2m14.455359469s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:35.430207066 +0000 UTC m=+153.002533493" watchObservedRunningTime="2025-12-11 13:06:35.455359469 +0000 UTC m=+153.027685906" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.456680 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt" podStartSLOduration=133.456672066 podStartE2EDuration="2m13.456672066s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:35.454870615 +0000 UTC m=+153.027197072" watchObservedRunningTime="2025-12-11 13:06:35.456672066 +0000 UTC m=+153.028998503" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.502252 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-42442" podStartSLOduration=134.502234479 podStartE2EDuration="2m14.502234479s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:35.481963213 +0000 UTC m=+153.054289660" watchObservedRunningTime="2025-12-11 13:06:35.502234479 +0000 UTC m=+153.074560916" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.545416 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:35 crc kubenswrapper[4898]: E1211 13:06:35.555293 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:36.055268882 +0000 UTC m=+153.627595319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.647737 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:35 crc kubenswrapper[4898]: E1211 13:06:35.648303 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:36.148288932 +0000 UTC m=+153.720615369 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.648999 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.671541 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" podStartSLOduration=133.671524261 podStartE2EDuration="2m13.671524261s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:35.501002185 +0000 UTC m=+153.073328632" watchObservedRunningTime="2025-12-11 13:06:35.671524261 +0000 UTC m=+153.243850698" Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.749708 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:35 crc kubenswrapper[4898]: E1211 13:06:35.753085 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-11 13:06:36.25304817 +0000 UTC m=+153.825374627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.852590 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:35 crc kubenswrapper[4898]: E1211 13:06:35.852929 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:36.352914702 +0000 UTC m=+153.925241139 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:35 crc kubenswrapper[4898]: I1211 13:06:35.954151 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:35 crc kubenswrapper[4898]: E1211 13:06:35.954727 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:36.454715398 +0000 UTC m=+154.027041835 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.055523 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:36 crc kubenswrapper[4898]: E1211 13:06:36.055856 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:36.555841164 +0000 UTC m=+154.128167591 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.090157 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.090191 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.102954 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.157441 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:36 crc kubenswrapper[4898]: E1211 13:06:36.158047 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:36.658033721 +0000 UTC m=+154.230360158 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.212993 4898 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.258863 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:36 crc kubenswrapper[4898]: E1211 13:06:36.259185 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:36.759170278 +0000 UTC m=+154.331496715 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.360393 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:36 crc kubenswrapper[4898]: E1211 13:06:36.360787 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:36.860773598 +0000 UTC m=+154.433100035 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.420003 4898 patch_prober.go:28] interesting pod/router-default-5444994796-jmljz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 13:06:36 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Dec 11 13:06:36 crc kubenswrapper[4898]: [+]process-running ok Dec 11 13:06:36 crc kubenswrapper[4898]: healthz check failed Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.420254 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jmljz" podUID="7b9b7fbd-6a91-4761-8604-b82c417e12f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.461545 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:36 crc kubenswrapper[4898]: E1211 13:06:36.461644 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-11 13:06:36.961618977 +0000 UTC m=+154.533945414 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.461834 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:36 crc kubenswrapper[4898]: E1211 13:06:36.462150 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:36.962143012 +0000 UTC m=+154.534469449 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.467625 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc" event={"ID":"2b85feb3-50df-4ad9-ac89-de94e7842c7e","Type":"ContainerStarted","Data":"f892cb79618b9daeda69476f688ebafe1d5c319fcb74626aa3c602a06d827816"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.467669 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc" event={"ID":"2b85feb3-50df-4ad9-ac89-de94e7842c7e","Type":"ContainerStarted","Data":"f20ab44c72bc081871e8a284eee9652677211860011a5201410365b734a7712b"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.476117 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-676rf" event={"ID":"aeb38d85-05d0-4a84-b3a7-4a7a168ccd98","Type":"ContainerStarted","Data":"ea1de51a667fbb67e07fa76fcaf5a43f28b618fe6b175e24bfdb680e832a9b65"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.481294 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n" event={"ID":"4d25d3be-3b53-4824-b357-5f251a16aa38","Type":"ContainerStarted","Data":"5f143afb2272de786e8ab18c7486dddadb6660146d3ac35fba7aff93f64656f4"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.481327 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n" event={"ID":"4d25d3be-3b53-4824-b357-5f251a16aa38","Type":"ContainerStarted","Data":"236ac0b2584826857b09959e4633e44f2020ba07aab6357263c5c072582c989d"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.492611 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cnbc" podStartSLOduration=134.492591583 podStartE2EDuration="2m14.492591583s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:36.491562434 +0000 UTC m=+154.063888871" watchObservedRunningTime="2025-12-11 13:06:36.492591583 +0000 UTC m=+154.064918020" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.496557 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mdzwq" event={"ID":"ccaafcb8-0877-4754-b7f7-43d3c35c6283","Type":"ContainerStarted","Data":"b88e2ee759289d3fd0d63c1cff72ea653bfd84f3524e74c10f6c786b7ebc2098"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.496593 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mdzwq" event={"ID":"ccaafcb8-0877-4754-b7f7-43d3c35c6283","Type":"ContainerStarted","Data":"5142402c522f974000dba82f85a8b93228b53b6a6c7e1d9691f7c8161082e61b"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.524191 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr" event={"ID":"c395a35a-0376-4626-a75d-c1d2631e3de1","Type":"ContainerStarted","Data":"56ef69edb02b605c9c7b95beb757cea3c379f7b830c7e852d821bbd8bbcf202d"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.524243 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr" event={"ID":"c395a35a-0376-4626-a75d-c1d2631e3de1","Type":"ContainerStarted","Data":"8cddc6e1305b36d23938ae7663df635f523e3b76b57488c0ee044b2768a333c0"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.524254 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr" event={"ID":"c395a35a-0376-4626-a75d-c1d2631e3de1","Type":"ContainerStarted","Data":"1702e60e879c2d348388cea3c83ee8b8687a89a87093042e1048318dfa377c06"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.525631 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-25l4n" podStartSLOduration=134.525615826 podStartE2EDuration="2m14.525615826s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:36.52395894 +0000 UTC m=+154.096285387" watchObservedRunningTime="2025-12-11 13:06:36.525615826 +0000 UTC m=+154.097942263" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.543974 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" event={"ID":"c000ebbb-ebbf-4861-b148-6a21649befb4","Type":"ContainerStarted","Data":"306b6a802e97c3f007a3bbcb9d2aa696834e3bc2b7cb7bfff5a3ba6550e58689"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.544024 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" event={"ID":"c000ebbb-ebbf-4861-b148-6a21649befb4","Type":"ContainerStarted","Data":"525f021f603d395b12cf85420ca10fb5d1fc92caba71d85714fd7ee1f712fe79"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.544758 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mdzwq" podStartSLOduration=7.544744411 podStartE2EDuration="7.544744411s" podCreationTimestamp="2025-12-11 13:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:36.543861466 +0000 UTC m=+154.116187923" watchObservedRunningTime="2025-12-11 13:06:36.544744411 +0000 UTC m=+154.117070848" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.545175 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.553220 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-ntzhr container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.553264 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" podUID="c000ebbb-ebbf-4861-b148-6a21649befb4" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.563843 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:36 crc kubenswrapper[4898]: E1211 13:06:36.565794 4898 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 13:06:37.065770529 +0000 UTC m=+154.638096966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.566094 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wm9qr" podStartSLOduration=134.566079557 podStartE2EDuration="2m14.566079557s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:36.564811182 +0000 UTC m=+154.137137619" watchObservedRunningTime="2025-12-11 13:06:36.566079557 +0000 UTC m=+154.138405994" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.575757 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" event={"ID":"aa90eeb0-bd02-434e-a457-47336b084be7","Type":"ContainerStarted","Data":"1313db1da18cb61fb6858969dabfba410adbb1e8fbadbde502465f7e1f29454b"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.575815 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" event={"ID":"aa90eeb0-bd02-434e-a457-47336b084be7","Type":"ContainerStarted","Data":"e896ca129ea0c03931cf1e367abf5ae761c5e7ed36a01a750151acb898b06326"} Dec 11 
13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.619910 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" event={"ID":"14360874-1ac7-4262-b7ed-3ccc4d909191","Type":"ContainerStarted","Data":"c98a3f6ef5db596a7f0449ba9bcd46e7d15769206b02bdae37bce71e22f22787"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.619961 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" event={"ID":"14360874-1ac7-4262-b7ed-3ccc4d909191","Type":"ContainerStarted","Data":"8b0f1c7ca7793bdc3c333ef5849e3e6580ae1583b2779bfaf0dbe9ec494d5b39"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.620234 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.625386 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vjbmx" event={"ID":"39bfd95c-8066-475a-ae86-f34ce2cae1e7","Type":"ContainerStarted","Data":"fb8b51c2c96de6c6943cc1af7b4fe8d66823aaccd4490685063b2585ac5df139"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.625417 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vjbmx" event={"ID":"39bfd95c-8066-475a-ae86-f34ce2cae1e7","Type":"ContainerStarted","Data":"3322e8c1e047c523b48f1dd1d27e3fd1fc155cda69840d67df706614dc1c04d6"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.643078 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cc9qv" event={"ID":"2ae587f2-140e-4780-844e-1eb3430f7ee6","Type":"ContainerStarted","Data":"8bb84283090b3206857508faa33e70ebb9240e1e3b74bd79003b1bd842395885"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.643125 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-cc9qv" event={"ID":"2ae587f2-140e-4780-844e-1eb3430f7ee6","Type":"ContainerStarted","Data":"dfee136c38083c7d29a9c9b2c90662f7b35e3826785e4fe81e0f7961304f333c"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.643143 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cc9qv" event={"ID":"2ae587f2-140e-4780-844e-1eb3430f7ee6","Type":"ContainerStarted","Data":"487ee88f49216c58a2b4ed41d9c5de8a245e9e99661e14dbcad59a1789019b90"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.643422 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-cc9qv" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.647511 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" event={"ID":"d5bcb23e-d100-4e41-bcc4-e8773a821c91","Type":"ContainerStarted","Data":"944e93b6284f2a5f1739d386a3439930f39da621a0fc683ba25f8b612880e1c8"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.647547 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" event={"ID":"d5bcb23e-d100-4e41-bcc4-e8773a821c91","Type":"ContainerStarted","Data":"4c28f4b30f6bbee2041fb43a855645b9e3a213e80368331bd33d199e407db0fb"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.647561 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" event={"ID":"d5bcb23e-d100-4e41-bcc4-e8773a821c91","Type":"ContainerStarted","Data":"7b214b5e1d15e53c543e38e5329ecb94dac9dc77fb12dda5ff72f32026ac73f2"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.666432 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" 
event={"ID":"588998fa-36f4-49d4-a69b-d60a3952787b","Type":"ContainerStarted","Data":"c3fa4696e196d8ccc2f585cc26828dbad4f59e29faba60b462e3c0f063b7679f"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.666491 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" event={"ID":"588998fa-36f4-49d4-a69b-d60a3952787b","Type":"ContainerStarted","Data":"9cfd6e1864e5ee1284a72d1d33f2c00bed9b4edccca00131e5f8c7bad6c49a20"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.667203 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.667904 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:36 crc kubenswrapper[4898]: E1211 13:06:36.669008 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:37.168993514 +0000 UTC m=+154.741319951 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.675827 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" podStartSLOduration=134.675811405 podStartE2EDuration="2m14.675811405s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:36.674865718 +0000 UTC m=+154.247192155" watchObservedRunningTime="2025-12-11 13:06:36.675811405 +0000 UTC m=+154.248137842" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.677060 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" podStartSLOduration=134.677052439 podStartE2EDuration="2m14.677052439s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:36.585896291 +0000 UTC m=+154.158222728" watchObservedRunningTime="2025-12-11 13:06:36.677052439 +0000 UTC m=+154.249378876" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.680694 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.691917 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" event={"ID":"e650293a-d60c-4e05-88dd-ea1fa46b3492","Type":"ContainerStarted","Data":"2336174d7dbc065d5466365b9f408e7c62157f8d84689adcf77e27663f53bd73"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.691969 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" event={"ID":"e650293a-d60c-4e05-88dd-ea1fa46b3492","Type":"ContainerStarted","Data":"43ba8efb427769ae60c7019bc6c117b96ddd8dd29a85c9203783291fa3be13d5"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.697386 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nv8jk" podStartSLOduration=134.697367847 podStartE2EDuration="2m14.697367847s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:36.693759206 +0000 UTC m=+154.266085643" watchObservedRunningTime="2025-12-11 13:06:36.697367847 +0000 UTC m=+154.269694284" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.703751 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" event={"ID":"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7","Type":"ContainerStarted","Data":"81686dc63fb1f23bef1974de32d5b70df5510bdbff8216a00b65a616ad229fdb"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.704513 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.713620 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-92m7s container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 
10.217.0.38:8080: connect: connection refused" start-of-body= Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.713686 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" podUID="5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.716152 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-vjbmx" podStartSLOduration=134.716134082 podStartE2EDuration="2m14.716134082s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:36.714228769 +0000 UTC m=+154.286555216" watchObservedRunningTime="2025-12-11 13:06:36.716134082 +0000 UTC m=+154.288460519" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.724710 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-22s99" event={"ID":"3ab8ca08-87a3-4920-aa33-6bbc290b1c15","Type":"ContainerStarted","Data":"b629c081df5068e41c7ca5ad0a2c271d5c1adc65182839826badc2bfa00839b8"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.724756 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-22s99" event={"ID":"3ab8ca08-87a3-4920-aa33-6bbc290b1c15","Type":"ContainerStarted","Data":"e0add9480760b7ec96160363408ac86e54441d9e4a4ebb82fae505d99eed6746"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.734357 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lw2tt" 
event={"ID":"7d9ad469-8c6d-45df-8a7d-84250d766f58","Type":"ContainerStarted","Data":"05ed553ef8331b279b028206858ee8dd993a197ad6068f6b510f9195a4733b65"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.740407 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6df8v" podStartSLOduration=134.740396 podStartE2EDuration="2m14.740396s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:36.73788361 +0000 UTC m=+154.310210057" watchObservedRunningTime="2025-12-11 13:06:36.740396 +0000 UTC m=+154.312722437" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.755349 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cc9qv" podStartSLOduration=8.755333158 podStartE2EDuration="8.755333158s" podCreationTimestamp="2025-12-11 13:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:36.754015771 +0000 UTC m=+154.326342208" watchObservedRunningTime="2025-12-11 13:06:36.755333158 +0000 UTC m=+154.327659595" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.760297 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9xjl" event={"ID":"065a3503-e7a8-4b7d-9cb7-366489ac0247","Type":"ContainerStarted","Data":"a3ff3d359e564b631f14517ed5d24a8cc9b645c066edef3669e5138b813a223e"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.765759 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" 
event={"ID":"d5b90074-739f-4f6d-a41c-29612abca57e","Type":"ContainerStarted","Data":"3804e4fe781cef3b6dd47d6e5b37a8dddcfbe1a69e3fce83384b78b91ba7f330"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.765807 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" event={"ID":"d5b90074-739f-4f6d-a41c-29612abca57e","Type":"ContainerStarted","Data":"9e5ba46a71ff751fe7b321b87853f9de521308de36a75c69cea5aaf120ff098d"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.765823 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" event={"ID":"d5b90074-739f-4f6d-a41c-29612abca57e","Type":"ContainerStarted","Data":"1367ab3b523e8337241d6b3641ed5c20db3a64f8e9bf58072771a103ddec0bad"} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.771242 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.771290 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.771633 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:36 crc kubenswrapper[4898]: E1211 13:06:36.787567 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-11 13:06:37.287546168 +0000 UTC m=+154.859872605 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.855038 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.855100 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vgm9d" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.855117 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-rkglj" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.855135 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.875378 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" podStartSLOduration=134.875361513 podStartE2EDuration="2m14.875361513s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:36.87313176 +0000 UTC m=+154.445458197" watchObservedRunningTime="2025-12-11 13:06:36.875361513 +0000 UTC m=+154.447687940" Dec 11 13:06:36 crc 
kubenswrapper[4898]: I1211 13:06:36.875603 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-22s99" podStartSLOduration=134.875598669 podStartE2EDuration="2m14.875598669s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:36.843380219 +0000 UTC m=+154.415706656" watchObservedRunningTime="2025-12-11 13:06:36.875598669 +0000 UTC m=+154.447925106" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.875911 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:36 crc kubenswrapper[4898]: E1211 13:06:36.878324 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 13:06:37.378307655 +0000 UTC m=+154.950634092 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g9vz7" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.935170 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" podStartSLOduration=134.935152144 podStartE2EDuration="2m14.935152144s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:36.899740284 +0000 UTC m=+154.472066721" watchObservedRunningTime="2025-12-11 13:06:36.935152144 +0000 UTC m=+154.507478581" Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.971782 4898 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-11T13:06:36.213020348Z","Handler":null,"Name":""} Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.974670 4898 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.974838 4898 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 11 13:06:36 crc kubenswrapper[4898]: I1211 13:06:36.977766 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.005021 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" podStartSLOduration=135.005007127 podStartE2EDuration="2m15.005007127s" podCreationTimestamp="2025-12-11 13:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:37.002420925 +0000 UTC m=+154.574747362" watchObservedRunningTime="2025-12-11 13:06:37.005007127 +0000 UTC m=+154.577333564" Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.027231 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.079292 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.099257 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.099296 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.271929 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g9vz7\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.399523 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.426474 4898 patch_prober.go:28] interesting pod/router-default-5444994796-jmljz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 13:06:37 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Dec 11 13:06:37 crc kubenswrapper[4898]: [+]process-running ok Dec 11 13:06:37 crc kubenswrapper[4898]: healthz check failed Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.426838 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jmljz" podUID="7b9b7fbd-6a91-4761-8604-b82c417e12f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.620636 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pjgq4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.620695 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" podUID="14360874-1ac7-4262-b7ed-3ccc4d909191" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.772926 4898 generic.go:334] "Generic (PLEG): container finished" podID="e650293a-d60c-4e05-88dd-ea1fa46b3492" containerID="2336174d7dbc065d5466365b9f408e7c62157f8d84689adcf77e27663f53bd73" exitCode=0 Dec 11 13:06:37 crc 
kubenswrapper[4898]: I1211 13:06:37.772981 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" event={"ID":"e650293a-d60c-4e05-88dd-ea1fa46b3492","Type":"ContainerDied","Data":"2336174d7dbc065d5466365b9f408e7c62157f8d84689adcf77e27663f53bd73"} Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.776720 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vjbmx" event={"ID":"39bfd95c-8066-475a-ae86-f34ce2cae1e7","Type":"ContainerStarted","Data":"f1e89a0ab760f47b53757f249d0858479a27d2dc1ee09b69d0ef067e6532279e"} Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.779712 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" event={"ID":"aa90eeb0-bd02-434e-a457-47336b084be7","Type":"ContainerStarted","Data":"eeb4954d5923248802520816eb0503fd0310b469703014f631e0bad2dfa6b928"} Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.788206 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.788589 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.790150 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ntzhr" Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.860654 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" podStartSLOduration=8.860638325 podStartE2EDuration="8.860638325s" podCreationTimestamp="2025-12-11 13:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:37.827617681 +0000 UTC m=+155.399944118" watchObservedRunningTime="2025-12-11 13:06:37.860638325 +0000 UTC m=+155.432964752" Dec 11 13:06:37 crc kubenswrapper[4898]: I1211 13:06:37.917408 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g9vz7"] Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.333928 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lsvvk"] Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.335034 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lsvvk" Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.336804 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.342615 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lsvvk"] Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.399592 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815c6898-33a4-4a0d-b751-689267c17053-catalog-content\") pod \"community-operators-lsvvk\" (UID: \"815c6898-33a4-4a0d-b751-689267c17053\") " pod="openshift-marketplace/community-operators-lsvvk" Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.399679 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815c6898-33a4-4a0d-b751-689267c17053-utilities\") pod \"community-operators-lsvvk\" (UID: \"815c6898-33a4-4a0d-b751-689267c17053\") " pod="openshift-marketplace/community-operators-lsvvk" Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.399713 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkpxv\" (UniqueName: \"kubernetes.io/projected/815c6898-33a4-4a0d-b751-689267c17053-kube-api-access-rkpxv\") pod \"community-operators-lsvvk\" (UID: \"815c6898-33a4-4a0d-b751-689267c17053\") " pod="openshift-marketplace/community-operators-lsvvk" Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.418710 4898 patch_prober.go:28] interesting pod/router-default-5444994796-jmljz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 13:06:38 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Dec 11 13:06:38 crc kubenswrapper[4898]: [+]process-running ok Dec 11 13:06:38 crc kubenswrapper[4898]: healthz check failed Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.418778 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jmljz" podUID="7b9b7fbd-6a91-4761-8604-b82c417e12f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.501078 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815c6898-33a4-4a0d-b751-689267c17053-utilities\") pod \"community-operators-lsvvk\" (UID: \"815c6898-33a4-4a0d-b751-689267c17053\") " pod="openshift-marketplace/community-operators-lsvvk" Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.501133 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkpxv\" (UniqueName: \"kubernetes.io/projected/815c6898-33a4-4a0d-b751-689267c17053-kube-api-access-rkpxv\") pod \"community-operators-lsvvk\" (UID: \"815c6898-33a4-4a0d-b751-689267c17053\") " pod="openshift-marketplace/community-operators-lsvvk" Dec 11 13:06:38 
crc kubenswrapper[4898]: I1211 13:06:38.501224 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815c6898-33a4-4a0d-b751-689267c17053-catalog-content\") pod \"community-operators-lsvvk\" (UID: \"815c6898-33a4-4a0d-b751-689267c17053\") " pod="openshift-marketplace/community-operators-lsvvk"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.502025 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815c6898-33a4-4a0d-b751-689267c17053-utilities\") pod \"community-operators-lsvvk\" (UID: \"815c6898-33a4-4a0d-b751-689267c17053\") " pod="openshift-marketplace/community-operators-lsvvk"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.502048 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815c6898-33a4-4a0d-b751-689267c17053-catalog-content\") pod \"community-operators-lsvvk\" (UID: \"815c6898-33a4-4a0d-b751-689267c17053\") " pod="openshift-marketplace/community-operators-lsvvk"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.533191 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkpxv\" (UniqueName: \"kubernetes.io/projected/815c6898-33a4-4a0d-b751-689267c17053-kube-api-access-rkpxv\") pod \"community-operators-lsvvk\" (UID: \"815c6898-33a4-4a0d-b751-689267c17053\") " pod="openshift-marketplace/community-operators-lsvvk"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.540329 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mmpgc"]
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.541306 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmpgc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.545320 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.591779 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mmpgc"]
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.602610 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhfdb\" (UniqueName: \"kubernetes.io/projected/0fca7537-45c2-4c42-adab-0c373132c342-kube-api-access-jhfdb\") pod \"certified-operators-mmpgc\" (UID: \"0fca7537-45c2-4c42-adab-0c373132c342\") " pod="openshift-marketplace/certified-operators-mmpgc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.602701 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fca7537-45c2-4c42-adab-0c373132c342-utilities\") pod \"certified-operators-mmpgc\" (UID: \"0fca7537-45c2-4c42-adab-0c373132c342\") " pod="openshift-marketplace/certified-operators-mmpgc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.602746 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fca7537-45c2-4c42-adab-0c373132c342-catalog-content\") pod \"certified-operators-mmpgc\" (UID: \"0fca7537-45c2-4c42-adab-0c373132c342\") " pod="openshift-marketplace/certified-operators-mmpgc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.648466 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lsvvk"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.698394 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.699222 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.701549 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.701738 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.703601 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fca7537-45c2-4c42-adab-0c373132c342-utilities\") pod \"certified-operators-mmpgc\" (UID: \"0fca7537-45c2-4c42-adab-0c373132c342\") " pod="openshift-marketplace/certified-operators-mmpgc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.703662 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fca7537-45c2-4c42-adab-0c373132c342-catalog-content\") pod \"certified-operators-mmpgc\" (UID: \"0fca7537-45c2-4c42-adab-0c373132c342\") " pod="openshift-marketplace/certified-operators-mmpgc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.703733 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhfdb\" (UniqueName: \"kubernetes.io/projected/0fca7537-45c2-4c42-adab-0c373132c342-kube-api-access-jhfdb\") pod \"certified-operators-mmpgc\" (UID: \"0fca7537-45c2-4c42-adab-0c373132c342\") " pod="openshift-marketplace/certified-operators-mmpgc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.704193 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fca7537-45c2-4c42-adab-0c373132c342-catalog-content\") pod \"certified-operators-mmpgc\" (UID: \"0fca7537-45c2-4c42-adab-0c373132c342\") " pod="openshift-marketplace/certified-operators-mmpgc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.704502 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fca7537-45c2-4c42-adab-0c373132c342-utilities\") pod \"certified-operators-mmpgc\" (UID: \"0fca7537-45c2-4c42-adab-0c373132c342\") " pod="openshift-marketplace/certified-operators-mmpgc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.712952 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.727257 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhfdb\" (UniqueName: \"kubernetes.io/projected/0fca7537-45c2-4c42-adab-0c373132c342-kube-api-access-jhfdb\") pod \"certified-operators-mmpgc\" (UID: \"0fca7537-45c2-4c42-adab-0c373132c342\") " pod="openshift-marketplace/certified-operators-mmpgc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.746986 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hhtm7"]
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.748288 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhtm7"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.756838 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hhtm7"]
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.787004 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.798732 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" event={"ID":"8850b908-0e43-45d6-a8d2-44e1fe06c4e0","Type":"ContainerStarted","Data":"d50693ef0e9ae66365cc644dff94c4ad4ff5f16c0182f45ec44ba9a2fbbeae97"}
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.798769 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" event={"ID":"8850b908-0e43-45d6-a8d2-44e1fe06c4e0","Type":"ContainerStarted","Data":"21d0a9aebc70d734f273ae6d088a4515f6a4e859751fff42a1a275dc723582f0"}
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.798837 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.808864 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22920719-7563-4c10-b96f-0d30fb4f55bf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"22920719-7563-4c10-b96f-0d30fb4f55bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.808902 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b72963-e048-458b-8e4b-7642a1dfc096-utilities\") pod \"community-operators-hhtm7\" (UID: \"01b72963-e048-458b-8e4b-7642a1dfc096\") " pod="openshift-marketplace/community-operators-hhtm7"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.808921 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqstf\" (UniqueName: \"kubernetes.io/projected/01b72963-e048-458b-8e4b-7642a1dfc096-kube-api-access-tqstf\") pod \"community-operators-hhtm7\" (UID: \"01b72963-e048-458b-8e4b-7642a1dfc096\") " pod="openshift-marketplace/community-operators-hhtm7"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.809169 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b72963-e048-458b-8e4b-7642a1dfc096-catalog-content\") pod \"community-operators-hhtm7\" (UID: \"01b72963-e048-458b-8e4b-7642a1dfc096\") " pod="openshift-marketplace/community-operators-hhtm7"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.809264 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22920719-7563-4c10-b96f-0d30fb4f55bf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"22920719-7563-4c10-b96f-0d30fb4f55bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.819074 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" podStartSLOduration=137.819047744 podStartE2EDuration="2m17.819047744s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:38.818080907 +0000 UTC m=+156.390407364" watchObservedRunningTime="2025-12-11 13:06:38.819047744 +0000 UTC m=+156.391374171"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.859761 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmpgc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.908313 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lsvvk"]
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.910566 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b72963-e048-458b-8e4b-7642a1dfc096-utilities\") pod \"community-operators-hhtm7\" (UID: \"01b72963-e048-458b-8e4b-7642a1dfc096\") " pod="openshift-marketplace/community-operators-hhtm7"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.910613 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqstf\" (UniqueName: \"kubernetes.io/projected/01b72963-e048-458b-8e4b-7642a1dfc096-kube-api-access-tqstf\") pod \"community-operators-hhtm7\" (UID: \"01b72963-e048-458b-8e4b-7642a1dfc096\") " pod="openshift-marketplace/community-operators-hhtm7"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.911084 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b72963-e048-458b-8e4b-7642a1dfc096-catalog-content\") pod \"community-operators-hhtm7\" (UID: \"01b72963-e048-458b-8e4b-7642a1dfc096\") " pod="openshift-marketplace/community-operators-hhtm7"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.911193 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22920719-7563-4c10-b96f-0d30fb4f55bf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"22920719-7563-4c10-b96f-0d30fb4f55bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.911259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22920719-7563-4c10-b96f-0d30fb4f55bf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"22920719-7563-4c10-b96f-0d30fb4f55bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.911335 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22920719-7563-4c10-b96f-0d30fb4f55bf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"22920719-7563-4c10-b96f-0d30fb4f55bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.912480 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b72963-e048-458b-8e4b-7642a1dfc096-utilities\") pod \"community-operators-hhtm7\" (UID: \"01b72963-e048-458b-8e4b-7642a1dfc096\") " pod="openshift-marketplace/community-operators-hhtm7"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.915860 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b72963-e048-458b-8e4b-7642a1dfc096-catalog-content\") pod \"community-operators-hhtm7\" (UID: \"01b72963-e048-458b-8e4b-7642a1dfc096\") " pod="openshift-marketplace/community-operators-hhtm7"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.932089 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-946ll"]
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.933097 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-946ll"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.937078 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22920719-7563-4c10-b96f-0d30fb4f55bf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"22920719-7563-4c10-b96f-0d30fb4f55bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.939424 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqstf\" (UniqueName: \"kubernetes.io/projected/01b72963-e048-458b-8e4b-7642a1dfc096-kube-api-access-tqstf\") pod \"community-operators-hhtm7\" (UID: \"01b72963-e048-458b-8e4b-7642a1dfc096\") " pod="openshift-marketplace/community-operators-hhtm7"
Dec 11 13:06:38 crc kubenswrapper[4898]: I1211 13:06:38.950419 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-946ll"]
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.024637 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss76d\" (UniqueName: \"kubernetes.io/projected/ceb008dd-582e-42f0-b83a-d3523d677e61-kube-api-access-ss76d\") pod \"certified-operators-946ll\" (UID: \"ceb008dd-582e-42f0-b83a-d3523d677e61\") " pod="openshift-marketplace/certified-operators-946ll"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.024718 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb008dd-582e-42f0-b83a-d3523d677e61-utilities\") pod \"certified-operators-946ll\" (UID: \"ceb008dd-582e-42f0-b83a-d3523d677e61\") " pod="openshift-marketplace/certified-operators-946ll"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.024759 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb008dd-582e-42f0-b83a-d3523d677e61-catalog-content\") pod \"certified-operators-946ll\" (UID: \"ceb008dd-582e-42f0-b83a-d3523d677e61\") " pod="openshift-marketplace/certified-operators-946ll"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.024915 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.064866 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhtm7"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.088128 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.125942 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss76d\" (UniqueName: \"kubernetes.io/projected/ceb008dd-582e-42f0-b83a-d3523d677e61-kube-api-access-ss76d\") pod \"certified-operators-946ll\" (UID: \"ceb008dd-582e-42f0-b83a-d3523d677e61\") " pod="openshift-marketplace/certified-operators-946ll"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.126019 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb008dd-582e-42f0-b83a-d3523d677e61-utilities\") pod \"certified-operators-946ll\" (UID: \"ceb008dd-582e-42f0-b83a-d3523d677e61\") " pod="openshift-marketplace/certified-operators-946ll"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.126058 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb008dd-582e-42f0-b83a-d3523d677e61-catalog-content\") pod \"certified-operators-946ll\" (UID: \"ceb008dd-582e-42f0-b83a-d3523d677e61\") " pod="openshift-marketplace/certified-operators-946ll"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.126767 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb008dd-582e-42f0-b83a-d3523d677e61-catalog-content\") pod \"certified-operators-946ll\" (UID: \"ceb008dd-582e-42f0-b83a-d3523d677e61\") " pod="openshift-marketplace/certified-operators-946ll"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.126998 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb008dd-582e-42f0-b83a-d3523d677e61-utilities\") pod \"certified-operators-946ll\" (UID: \"ceb008dd-582e-42f0-b83a-d3523d677e61\") " pod="openshift-marketplace/certified-operators-946ll"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.132911 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mmpgc"]
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.149658 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss76d\" (UniqueName: \"kubernetes.io/projected/ceb008dd-582e-42f0-b83a-d3523d677e61-kube-api-access-ss76d\") pod \"certified-operators-946ll\" (UID: \"ceb008dd-582e-42f0-b83a-d3523d677e61\") " pod="openshift-marketplace/certified-operators-946ll"
Dec 11 13:06:39 crc kubenswrapper[4898]: W1211 13:06:39.169981 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fca7537_45c2_4c42_adab_0c373132c342.slice/crio-1e596a6261ff49e750f0b01904dfa27f844d0c440a257847606c4cb5a745e41a WatchSource:0}: Error finding container 1e596a6261ff49e750f0b01904dfa27f844d0c440a257847606c4cb5a745e41a: Status 404 returned error can't find the container with id 1e596a6261ff49e750f0b01904dfa27f844d0c440a257847606c4cb5a745e41a
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.226803 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e650293a-d60c-4e05-88dd-ea1fa46b3492-config-volume\") pod \"e650293a-d60c-4e05-88dd-ea1fa46b3492\" (UID: \"e650293a-d60c-4e05-88dd-ea1fa46b3492\") "
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.226887 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e650293a-d60c-4e05-88dd-ea1fa46b3492-secret-volume\") pod \"e650293a-d60c-4e05-88dd-ea1fa46b3492\" (UID: \"e650293a-d60c-4e05-88dd-ea1fa46b3492\") "
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.226911 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pfkw\" (UniqueName: \"kubernetes.io/projected/e650293a-d60c-4e05-88dd-ea1fa46b3492-kube-api-access-9pfkw\") pod \"e650293a-d60c-4e05-88dd-ea1fa46b3492\" (UID: \"e650293a-d60c-4e05-88dd-ea1fa46b3492\") "
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.227898 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e650293a-d60c-4e05-88dd-ea1fa46b3492-config-volume" (OuterVolumeSpecName: "config-volume") pod "e650293a-d60c-4e05-88dd-ea1fa46b3492" (UID: "e650293a-d60c-4e05-88dd-ea1fa46b3492"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.230499 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e650293a-d60c-4e05-88dd-ea1fa46b3492-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e650293a-d60c-4e05-88dd-ea1fa46b3492" (UID: "e650293a-d60c-4e05-88dd-ea1fa46b3492"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.230683 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e650293a-d60c-4e05-88dd-ea1fa46b3492-kube-api-access-9pfkw" (OuterVolumeSpecName: "kube-api-access-9pfkw") pod "e650293a-d60c-4e05-88dd-ea1fa46b3492" (UID: "e650293a-d60c-4e05-88dd-ea1fa46b3492"). InnerVolumeSpecName "kube-api-access-9pfkw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.256730 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-946ll"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.272648 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.328022 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e650293a-d60c-4e05-88dd-ea1fa46b3492-config-volume\") on node \"crc\" DevicePath \"\""
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.328047 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e650293a-d60c-4e05-88dd-ea1fa46b3492-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.328057 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pfkw\" (UniqueName: \"kubernetes.io/projected/e650293a-d60c-4e05-88dd-ea1fa46b3492-kube-api-access-9pfkw\") on node \"crc\" DevicePath \"\""
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.343108 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hhtm7"]
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.427190 4898 patch_prober.go:28] interesting pod/router-default-5444994796-jmljz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 11 13:06:39 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Dec 11 13:06:39 crc kubenswrapper[4898]: [+]process-running ok
Dec 11 13:06:39 crc kubenswrapper[4898]: healthz check failed
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.427276 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jmljz" podUID="7b9b7fbd-6a91-4761-8604-b82c417e12f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.527901 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-946ll"]
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.805067 4898 generic.go:334] "Generic (PLEG): container finished" podID="0fca7537-45c2-4c42-adab-0c373132c342" containerID="77a0d31a682497f45a0e260fd72190c8bcabb20d50392b45dff1356100166635" exitCode=0
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.805227 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmpgc" event={"ID":"0fca7537-45c2-4c42-adab-0c373132c342","Type":"ContainerDied","Data":"77a0d31a682497f45a0e260fd72190c8bcabb20d50392b45dff1356100166635"}
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.805388 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmpgc" event={"ID":"0fca7537-45c2-4c42-adab-0c373132c342","Type":"ContainerStarted","Data":"1e596a6261ff49e750f0b01904dfa27f844d0c440a257847606c4cb5a745e41a"}
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.807117 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.809184 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-nnwb2_e65a531b-17ae-4b24-b9f6-71c758a757b0/cluster-samples-operator/0.log"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.809235 4898 generic.go:334] "Generic (PLEG): container finished" podID="e65a531b-17ae-4b24-b9f6-71c758a757b0" containerID="adc7f3b9b6e4c43269851afe81ead6142dcf0821e8f3fcb70a7ac27c4585d50e" exitCode=2
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.809313 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2" event={"ID":"e65a531b-17ae-4b24-b9f6-71c758a757b0","Type":"ContainerDied","Data":"adc7f3b9b6e4c43269851afe81ead6142dcf0821e8f3fcb70a7ac27c4585d50e"}
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.809787 4898 scope.go:117] "RemoveContainer" containerID="adc7f3b9b6e4c43269851afe81ead6142dcf0821e8f3fcb70a7ac27c4585d50e"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.810608 4898 generic.go:334] "Generic (PLEG): container finished" podID="01b72963-e048-458b-8e4b-7642a1dfc096" containerID="41427437818a6458bcb6b899f47745695f57624f80f0bb6b060c263b58405966" exitCode=0
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.810815 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhtm7" event={"ID":"01b72963-e048-458b-8e4b-7642a1dfc096","Type":"ContainerDied","Data":"41427437818a6458bcb6b899f47745695f57624f80f0bb6b060c263b58405966"}
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.810891 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhtm7" event={"ID":"01b72963-e048-458b-8e4b-7642a1dfc096","Type":"ContainerStarted","Data":"bf8beb32f6ff32d80c33248c238cc7cd61c4dc952f34373fde3814634e79d05d"}
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.816981 4898 generic.go:334] "Generic (PLEG): container finished" podID="815c6898-33a4-4a0d-b751-689267c17053" containerID="14574a77796dcddf703853dc6e12ffe7449522da44fe7e5c1a94db882991860d" exitCode=0
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.817067 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsvvk" event={"ID":"815c6898-33a4-4a0d-b751-689267c17053","Type":"ContainerDied","Data":"14574a77796dcddf703853dc6e12ffe7449522da44fe7e5c1a94db882991860d"}
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.817093 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsvvk" event={"ID":"815c6898-33a4-4a0d-b751-689267c17053","Type":"ContainerStarted","Data":"21adf4bb027db954d33468bf7b347277209ad667b342f9f71dd8cf805e18709b"}
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.820442 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.820441 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6" event={"ID":"e650293a-d60c-4e05-88dd-ea1fa46b3492","Type":"ContainerDied","Data":"43ba8efb427769ae60c7019bc6c117b96ddd8dd29a85c9203783291fa3be13d5"}
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.820669 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43ba8efb427769ae60c7019bc6c117b96ddd8dd29a85c9203783291fa3be13d5"
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.822759 4898 generic.go:334] "Generic (PLEG): container finished" podID="ceb008dd-582e-42f0-b83a-d3523d677e61" containerID="a374fb785cb718eb3e062f8d9e528f133e4722ac75a7e0c88ab821702fffb460" exitCode=0
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.822811 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-946ll" event={"ID":"ceb008dd-582e-42f0-b83a-d3523d677e61","Type":"ContainerDied","Data":"a374fb785cb718eb3e062f8d9e528f133e4722ac75a7e0c88ab821702fffb460"}
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.822827 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-946ll" event={"ID":"ceb008dd-582e-42f0-b83a-d3523d677e61","Type":"ContainerStarted","Data":"1e28529044f204c3bab2704e12b531f7536fbd32407b6cb1d5c253119775dee3"}
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.829932 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"22920719-7563-4c10-b96f-0d30fb4f55bf","Type":"ContainerStarted","Data":"7c2c3f262fc692afccf4bcc78cd33302426b1ac372c9a6662c70dc89849dc77b"}
Dec 11 13:06:39 crc kubenswrapper[4898]: I1211 13:06:39.902646 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.902625684 podStartE2EDuration="1.902625684s" podCreationTimestamp="2025-12-11 13:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:39.884505907 +0000 UTC m=+157.456832344" watchObservedRunningTime="2025-12-11 13:06:39.902625684 +0000 UTC m=+157.474952121"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.330107 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kl8sx"]
Dec 11 13:06:40 crc kubenswrapper[4898]: E1211 13:06:40.330338 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e650293a-d60c-4e05-88dd-ea1fa46b3492" containerName="collect-profiles"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.330352 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e650293a-d60c-4e05-88dd-ea1fa46b3492" containerName="collect-profiles"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.330451 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e650293a-d60c-4e05-88dd-ea1fa46b3492" containerName="collect-profiles"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.331142 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kl8sx"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.333518 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.343254 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kl8sx"]
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.417501 4898 patch_prober.go:28] interesting pod/router-default-5444994796-jmljz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 11 13:06:40 crc kubenswrapper[4898]: [+]has-synced ok
Dec 11 13:06:40 crc kubenswrapper[4898]: [+]process-running ok
Dec 11 13:06:40 crc kubenswrapper[4898]: healthz check failed
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.417646 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jmljz" podUID="7b9b7fbd-6a91-4761-8604-b82c417e12f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.445769 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-catalog-content\") pod \"redhat-marketplace-kl8sx\" (UID: \"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389\") " pod="openshift-marketplace/redhat-marketplace-kl8sx"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.445873 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-utilities\") pod \"redhat-marketplace-kl8sx\" (UID: \"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389\") " pod="openshift-marketplace/redhat-marketplace-kl8sx"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.445911 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j458j\" (UniqueName: \"kubernetes.io/projected/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-kube-api-access-j458j\") pod \"redhat-marketplace-kl8sx\" (UID: \"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389\") " pod="openshift-marketplace/redhat-marketplace-kl8sx"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.547633 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-utilities\") pod \"redhat-marketplace-kl8sx\" (UID: \"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389\") " pod="openshift-marketplace/redhat-marketplace-kl8sx"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.547688 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j458j\" (UniqueName: \"kubernetes.io/projected/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-kube-api-access-j458j\") pod \"redhat-marketplace-kl8sx\" (UID: \"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389\") " pod="openshift-marketplace/redhat-marketplace-kl8sx"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.547725 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-catalog-content\") pod \"redhat-marketplace-kl8sx\" (UID: \"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389\") " pod="openshift-marketplace/redhat-marketplace-kl8sx"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.548262 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-catalog-content\") pod \"redhat-marketplace-kl8sx\" (UID: \"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389\") " pod="openshift-marketplace/redhat-marketplace-kl8sx"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.548283 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-utilities\") pod \"redhat-marketplace-kl8sx\" (UID: \"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389\") " pod="openshift-marketplace/redhat-marketplace-kl8sx"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.566363 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j458j\" (UniqueName: \"kubernetes.io/projected/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-kube-api-access-j458j\") pod \"redhat-marketplace-kl8sx\" (UID: \"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389\") " pod="openshift-marketplace/redhat-marketplace-kl8sx"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.648310 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kl8sx"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.738650 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mpfz8"]
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.739862 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mpfz8"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.750439 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mpfz8"]
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.814256 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.854096 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-catalog-content\") pod \"redhat-marketplace-mpfz8\" (UID: \"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed\") " pod="openshift-marketplace/redhat-marketplace-mpfz8"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.854145 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6txzk\" (UniqueName: \"kubernetes.io/projected/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-kube-api-access-6txzk\") pod \"redhat-marketplace-mpfz8\" (UID: \"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed\") " pod="openshift-marketplace/redhat-marketplace-mpfz8"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.854179 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-utilities\") pod \"redhat-marketplace-mpfz8\" (UID: \"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed\") " pod="openshift-marketplace/redhat-marketplace-mpfz8"
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.915392 4898 generic.go:334] "Generic (PLEG): container finished" podID="22920719-7563-4c10-b96f-0d30fb4f55bf" containerID="87c4e95be22ee42d2103c3975eedb0a222d3d91e265fa25231a80ecb51d665b4" exitCode=0
Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.915443
4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"22920719-7563-4c10-b96f-0d30fb4f55bf","Type":"ContainerDied","Data":"87c4e95be22ee42d2103c3975eedb0a222d3d91e265fa25231a80ecb51d665b4"} Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.943113 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-nnwb2_e65a531b-17ae-4b24-b9f6-71c758a757b0/cluster-samples-operator/0.log" Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.943982 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nnwb2" event={"ID":"e65a531b-17ae-4b24-b9f6-71c758a757b0","Type":"ContainerStarted","Data":"3f03ddf0b2afe1c7709279404810d3d66c43558da72c9235416c1c2256ee9fa5"} Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.955293 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-catalog-content\") pod \"redhat-marketplace-mpfz8\" (UID: \"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed\") " pod="openshift-marketplace/redhat-marketplace-mpfz8" Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.955341 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6txzk\" (UniqueName: \"kubernetes.io/projected/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-kube-api-access-6txzk\") pod \"redhat-marketplace-mpfz8\" (UID: \"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed\") " pod="openshift-marketplace/redhat-marketplace-mpfz8" Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.955375 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-utilities\") pod \"redhat-marketplace-mpfz8\" (UID: 
\"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed\") " pod="openshift-marketplace/redhat-marketplace-mpfz8" Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.956521 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-catalog-content\") pod \"redhat-marketplace-mpfz8\" (UID: \"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed\") " pod="openshift-marketplace/redhat-marketplace-mpfz8" Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.957081 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-utilities\") pod \"redhat-marketplace-mpfz8\" (UID: \"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed\") " pod="openshift-marketplace/redhat-marketplace-mpfz8" Dec 11 13:06:40 crc kubenswrapper[4898]: I1211 13:06:40.981356 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6txzk\" (UniqueName: \"kubernetes.io/projected/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-kube-api-access-6txzk\") pod \"redhat-marketplace-mpfz8\" (UID: \"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed\") " pod="openshift-marketplace/redhat-marketplace-mpfz8" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.026574 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kl8sx"] Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.075861 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mpfz8" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.381696 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mpfz8"] Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.386677 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.386745 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.393314 4898 patch_prober.go:28] interesting pod/console-f9d7485db-7cbpd container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.393367 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7cbpd" podUID="dd7edc7b-2d3a-4402-b7de-e70de317e52e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Dec 11 13:06:41 crc kubenswrapper[4898]: W1211 13:06:41.394487 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c5302aa_5a81_4257_93b8_aefd5e5cc2ed.slice/crio-bfba02839018ee6d150c0f949a62799d9db907ae03acb515312999598c77667f WatchSource:0}: Error finding container bfba02839018ee6d150c0f949a62799d9db907ae03acb515312999598c77667f: Status 404 returned error can't find the container with id bfba02839018ee6d150c0f949a62799d9db907ae03acb515312999598c77667f Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.414867 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.422966 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.535111 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dnsvs"] Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.545353 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.545611 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dnsvs"] Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.548385 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.562199 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3afb839-6915-42b9-9b88-72d11bf5caa2-catalog-content\") pod \"redhat-operators-dnsvs\" (UID: \"e3afb839-6915-42b9-9b88-72d11bf5caa2\") " pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.562275 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3afb839-6915-42b9-9b88-72d11bf5caa2-utilities\") pod \"redhat-operators-dnsvs\" (UID: \"e3afb839-6915-42b9-9b88-72d11bf5caa2\") " pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.562340 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbvtn\" (UniqueName: 
\"kubernetes.io/projected/e3afb839-6915-42b9-9b88-72d11bf5caa2-kube-api-access-rbvtn\") pod \"redhat-operators-dnsvs\" (UID: \"e3afb839-6915-42b9-9b88-72d11bf5caa2\") " pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.664024 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3afb839-6915-42b9-9b88-72d11bf5caa2-catalog-content\") pod \"redhat-operators-dnsvs\" (UID: \"e3afb839-6915-42b9-9b88-72d11bf5caa2\") " pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.664093 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3afb839-6915-42b9-9b88-72d11bf5caa2-utilities\") pod \"redhat-operators-dnsvs\" (UID: \"e3afb839-6915-42b9-9b88-72d11bf5caa2\") " pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.664160 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbvtn\" (UniqueName: \"kubernetes.io/projected/e3afb839-6915-42b9-9b88-72d11bf5caa2-kube-api-access-rbvtn\") pod \"redhat-operators-dnsvs\" (UID: \"e3afb839-6915-42b9-9b88-72d11bf5caa2\") " pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.664821 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3afb839-6915-42b9-9b88-72d11bf5caa2-catalog-content\") pod \"redhat-operators-dnsvs\" (UID: \"e3afb839-6915-42b9-9b88-72d11bf5caa2\") " pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.664864 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e3afb839-6915-42b9-9b88-72d11bf5caa2-utilities\") pod \"redhat-operators-dnsvs\" (UID: \"e3afb839-6915-42b9-9b88-72d11bf5caa2\") " pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.680158 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-4bmq4 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.680208 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4bmq4" podUID="5442029d-e1aa-4496-b3c6-18f11b034179" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.680240 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-4bmq4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.680913 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4bmq4" podUID="5442029d-e1aa-4496-b3c6-18f11b034179" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.701832 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbvtn\" (UniqueName: \"kubernetes.io/projected/e3afb839-6915-42b9-9b88-72d11bf5caa2-kube-api-access-rbvtn\") pod \"redhat-operators-dnsvs\" (UID: \"e3afb839-6915-42b9-9b88-72d11bf5caa2\") " pod="openshift-marketplace/redhat-operators-dnsvs" 
Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.812650 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.813495 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.822647 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.824985 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.826945 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.866512 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b5f2a47-e760-48ef-80e8-7478de392bd9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9b5f2a47-e760-48ef-80e8-7478de392bd9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.866567 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b5f2a47-e760-48ef-80e8-7478de392bd9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9b5f2a47-e760-48ef-80e8-7478de392bd9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.928736 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.940414 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ch69p"] Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.941887 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.943544 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ch69p"] Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.952855 4898 generic.go:334] "Generic (PLEG): container finished" podID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" containerID="ee92dc81976314da41a11ca82e58bbc002d0cbb3ce4c810295a63c11a9467691" exitCode=0 Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.952981 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kl8sx" event={"ID":"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389","Type":"ContainerDied","Data":"ee92dc81976314da41a11ca82e58bbc002d0cbb3ce4c810295a63c11a9467691"} Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.953017 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kl8sx" event={"ID":"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389","Type":"ContainerStarted","Data":"004116e9819d4d531c207532ccb82cbc623f24dd1e7f604337e5ebf912285867"} Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.963399 4898 generic.go:334] "Generic (PLEG): container finished" podID="8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" containerID="a801c1fa5517ca32c45554b7817c497a8cd2e8b2260c9ae471748178efdc61d6" exitCode=0 Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.965579 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpfz8" 
event={"ID":"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed","Type":"ContainerDied","Data":"a801c1fa5517ca32c45554b7817c497a8cd2e8b2260c9ae471748178efdc61d6"} Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.965734 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpfz8" event={"ID":"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed","Type":"ContainerStarted","Data":"bfba02839018ee6d150c0f949a62799d9db907ae03acb515312999598c77667f"} Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.967513 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b5f2a47-e760-48ef-80e8-7478de392bd9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9b5f2a47-e760-48ef-80e8-7478de392bd9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.967571 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d4f5c98-5dd0-497e-be83-b8d669c734ad-catalog-content\") pod \"redhat-operators-ch69p\" (UID: \"6d4f5c98-5dd0-497e-be83-b8d669c734ad\") " pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.967613 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b5f2a47-e760-48ef-80e8-7478de392bd9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9b5f2a47-e760-48ef-80e8-7478de392bd9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.967649 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b5f2a47-e760-48ef-80e8-7478de392bd9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9b5f2a47-e760-48ef-80e8-7478de392bd9\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.967679 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d4f5c98-5dd0-497e-be83-b8d669c734ad-utilities\") pod \"redhat-operators-ch69p\" (UID: \"6d4f5c98-5dd0-497e-be83-b8d669c734ad\") " pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.967716 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnswd\" (UniqueName: \"kubernetes.io/projected/6d4f5c98-5dd0-497e-be83-b8d669c734ad-kube-api-access-tnswd\") pod \"redhat-operators-ch69p\" (UID: \"6d4f5c98-5dd0-497e-be83-b8d669c734ad\") " pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.968999 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-jmljz" Dec 11 13:06:41 crc kubenswrapper[4898]: I1211 13:06:41.992321 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b5f2a47-e760-48ef-80e8-7478de392bd9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9b5f2a47-e760-48ef-80e8-7478de392bd9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.068824 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d4f5c98-5dd0-497e-be83-b8d669c734ad-catalog-content\") pod \"redhat-operators-ch69p\" (UID: \"6d4f5c98-5dd0-497e-be83-b8d669c734ad\") " pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.068990 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/6d4f5c98-5dd0-497e-be83-b8d669c734ad-utilities\") pod \"redhat-operators-ch69p\" (UID: \"6d4f5c98-5dd0-497e-be83-b8d669c734ad\") " pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.069046 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnswd\" (UniqueName: \"kubernetes.io/projected/6d4f5c98-5dd0-497e-be83-b8d669c734ad-kube-api-access-tnswd\") pod \"redhat-operators-ch69p\" (UID: \"6d4f5c98-5dd0-497e-be83-b8d669c734ad\") " pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.071101 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d4f5c98-5dd0-497e-be83-b8d669c734ad-catalog-content\") pod \"redhat-operators-ch69p\" (UID: \"6d4f5c98-5dd0-497e-be83-b8d669c734ad\") " pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.072594 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d4f5c98-5dd0-497e-be83-b8d669c734ad-utilities\") pod \"redhat-operators-ch69p\" (UID: \"6d4f5c98-5dd0-497e-be83-b8d669c734ad\") " pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.090639 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnswd\" (UniqueName: \"kubernetes.io/projected/6d4f5c98-5dd0-497e-be83-b8d669c734ad-kube-api-access-tnswd\") pod \"redhat-operators-ch69p\" (UID: \"6d4f5c98-5dd0-497e-be83-b8d669c734ad\") " pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.149124 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.319411 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dnsvs"] Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.331171 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.508292 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.511956 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.631628 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22920719-7563-4c10-b96f-0d30fb4f55bf-kube-api-access\") pod \"22920719-7563-4c10-b96f-0d30fb4f55bf\" (UID: \"22920719-7563-4c10-b96f-0d30fb4f55bf\") " Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.632093 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22920719-7563-4c10-b96f-0d30fb4f55bf-kubelet-dir\") pod \"22920719-7563-4c10-b96f-0d30fb4f55bf\" (UID: \"22920719-7563-4c10-b96f-0d30fb4f55bf\") " Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.632429 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22920719-7563-4c10-b96f-0d30fb4f55bf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "22920719-7563-4c10-b96f-0d30fb4f55bf" (UID: "22920719-7563-4c10-b96f-0d30fb4f55bf"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.639571 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22920719-7563-4c10-b96f-0d30fb4f55bf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "22920719-7563-4c10-b96f-0d30fb4f55bf" (UID: "22920719-7563-4c10-b96f-0d30fb4f55bf"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.647587 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ch69p"] Dec 11 13:06:42 crc kubenswrapper[4898]: W1211 13:06:42.668427 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d4f5c98_5dd0_497e_be83_b8d669c734ad.slice/crio-b53bf77012cbafcd144bf38c74f9888d72246aecb2d1122e57e249665afc2f32 WatchSource:0}: Error finding container b53bf77012cbafcd144bf38c74f9888d72246aecb2d1122e57e249665afc2f32: Status 404 returned error can't find the container with id b53bf77012cbafcd144bf38c74f9888d72246aecb2d1122e57e249665afc2f32 Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.733145 4898 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22920719-7563-4c10-b96f-0d30fb4f55bf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.733177 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22920719-7563-4c10-b96f-0d30fb4f55bf-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.978205 4898 generic.go:334] "Generic (PLEG): container finished" podID="6d4f5c98-5dd0-497e-be83-b8d669c734ad" containerID="82bd450040b03e2c45ee6a41587dc9612afdcb1b43f04007a766e7d1cde162cc" exitCode=0 
Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.978492 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ch69p" event={"ID":"6d4f5c98-5dd0-497e-be83-b8d669c734ad","Type":"ContainerDied","Data":"82bd450040b03e2c45ee6a41587dc9612afdcb1b43f04007a766e7d1cde162cc"} Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.978520 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ch69p" event={"ID":"6d4f5c98-5dd0-497e-be83-b8d669c734ad","Type":"ContainerStarted","Data":"b53bf77012cbafcd144bf38c74f9888d72246aecb2d1122e57e249665afc2f32"} Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.981713 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.981739 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"22920719-7563-4c10-b96f-0d30fb4f55bf","Type":"ContainerDied","Data":"7c2c3f262fc692afccf4bcc78cd33302426b1ac372c9a6662c70dc89849dc77b"} Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.981783 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c2c3f262fc692afccf4bcc78cd33302426b1ac372c9a6662c70dc89849dc77b" Dec 11 13:06:42 crc kubenswrapper[4898]: I1211 13:06:42.988268 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9b5f2a47-e760-48ef-80e8-7478de392bd9","Type":"ContainerStarted","Data":"092aa905d715c20415cf63f2aaaa41e8a24e349ee1972595709e9308b289899f"} Dec 11 13:06:43 crc kubenswrapper[4898]: I1211 13:06:42.991447 4898 generic.go:334] "Generic (PLEG): container finished" podID="e3afb839-6915-42b9-9b88-72d11bf5caa2" containerID="b9d59b1e43acdeafae38373af523219fc0ed26a5b8a2520a1a9a4dddf4b3cb5b" exitCode=0 Dec 11 13:06:43 crc 
kubenswrapper[4898]: I1211 13:06:42.992583 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnsvs" event={"ID":"e3afb839-6915-42b9-9b88-72d11bf5caa2","Type":"ContainerDied","Data":"b9d59b1e43acdeafae38373af523219fc0ed26a5b8a2520a1a9a4dddf4b3cb5b"} Dec 11 13:06:43 crc kubenswrapper[4898]: I1211 13:06:42.992605 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnsvs" event={"ID":"e3afb839-6915-42b9-9b88-72d11bf5caa2","Type":"ContainerStarted","Data":"4a46a68985b15098357ae9cd7c5e76aa77609e6afacff4cf4e720f6876bcfa51"} Dec 11 13:06:44 crc kubenswrapper[4898]: I1211 13:06:44.010800 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9b5f2a47-e760-48ef-80e8-7478de392bd9","Type":"ContainerStarted","Data":"1aedf9ad3b595b937536057423faf3e64820015bdd801470f457b3cbd460d4ac"} Dec 11 13:06:44 crc kubenswrapper[4898]: I1211 13:06:44.028567 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.028551966 podStartE2EDuration="3.028551966s" podCreationTimestamp="2025-12-11 13:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:06:44.027324892 +0000 UTC m=+161.599651329" watchObservedRunningTime="2025-12-11 13:06:44.028551966 +0000 UTC m=+161.600878393" Dec 11 13:06:44 crc kubenswrapper[4898]: I1211 13:06:44.361841 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs\") pod \"network-metrics-daemon-zcq7l\" (UID: \"34380c7c-1d75-4f6f-a6cb-b015a55ca978\") " pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:44 crc kubenswrapper[4898]: I1211 13:06:44.368226 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34380c7c-1d75-4f6f-a6cb-b015a55ca978-metrics-certs\") pod \"network-metrics-daemon-zcq7l\" (UID: \"34380c7c-1d75-4f6f-a6cb-b015a55ca978\") " pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:44 crc kubenswrapper[4898]: I1211 13:06:44.586506 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zcq7l" Dec 11 13:06:44 crc kubenswrapper[4898]: I1211 13:06:44.890577 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zcq7l"] Dec 11 13:06:45 crc kubenswrapper[4898]: I1211 13:06:45.031833 4898 generic.go:334] "Generic (PLEG): container finished" podID="9b5f2a47-e760-48ef-80e8-7478de392bd9" containerID="1aedf9ad3b595b937536057423faf3e64820015bdd801470f457b3cbd460d4ac" exitCode=0 Dec 11 13:06:45 crc kubenswrapper[4898]: I1211 13:06:45.034728 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9b5f2a47-e760-48ef-80e8-7478de392bd9","Type":"ContainerDied","Data":"1aedf9ad3b595b937536057423faf3e64820015bdd801470f457b3cbd460d4ac"} Dec 11 13:06:45 crc kubenswrapper[4898]: I1211 13:06:45.039110 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" event={"ID":"34380c7c-1d75-4f6f-a6cb-b015a55ca978","Type":"ContainerStarted","Data":"e51e5e8558777384c345af60c9e21e53c01c93d2a644e6c842c30eea4bf50e61"} Dec 11 13:06:46 crc kubenswrapper[4898]: I1211 13:06:46.371736 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 13:06:46 crc kubenswrapper[4898]: I1211 13:06:46.389500 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b5f2a47-e760-48ef-80e8-7478de392bd9-kubelet-dir\") pod \"9b5f2a47-e760-48ef-80e8-7478de392bd9\" (UID: \"9b5f2a47-e760-48ef-80e8-7478de392bd9\") " Dec 11 13:06:46 crc kubenswrapper[4898]: I1211 13:06:46.389579 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b5f2a47-e760-48ef-80e8-7478de392bd9-kube-api-access\") pod \"9b5f2a47-e760-48ef-80e8-7478de392bd9\" (UID: \"9b5f2a47-e760-48ef-80e8-7478de392bd9\") " Dec 11 13:06:46 crc kubenswrapper[4898]: I1211 13:06:46.389620 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b5f2a47-e760-48ef-80e8-7478de392bd9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9b5f2a47-e760-48ef-80e8-7478de392bd9" (UID: "9b5f2a47-e760-48ef-80e8-7478de392bd9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:06:46 crc kubenswrapper[4898]: I1211 13:06:46.389940 4898 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b5f2a47-e760-48ef-80e8-7478de392bd9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 11 13:06:46 crc kubenswrapper[4898]: I1211 13:06:46.395114 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b5f2a47-e760-48ef-80e8-7478de392bd9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9b5f2a47-e760-48ef-80e8-7478de392bd9" (UID: "9b5f2a47-e760-48ef-80e8-7478de392bd9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:06:46 crc kubenswrapper[4898]: I1211 13:06:46.490729 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b5f2a47-e760-48ef-80e8-7478de392bd9-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 11 13:06:47 crc kubenswrapper[4898]: I1211 13:06:47.053105 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9b5f2a47-e760-48ef-80e8-7478de392bd9","Type":"ContainerDied","Data":"092aa905d715c20415cf63f2aaaa41e8a24e349ee1972595709e9308b289899f"} Dec 11 13:06:47 crc kubenswrapper[4898]: I1211 13:06:47.053175 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="092aa905d715c20415cf63f2aaaa41e8a24e349ee1972595709e9308b289899f" Dec 11 13:06:47 crc kubenswrapper[4898]: I1211 13:06:47.053265 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 13:06:47 crc kubenswrapper[4898]: I1211 13:06:47.477810 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cc9qv" Dec 11 13:06:49 crc kubenswrapper[4898]: I1211 13:06:49.068409 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" event={"ID":"34380c7c-1d75-4f6f-a6cb-b015a55ca978","Type":"ContainerStarted","Data":"775181db49dc2ae0871393b74a77545b94aff6c857d97bd3a8939a8e1bc32c1e"} Dec 11 13:06:51 crc kubenswrapper[4898]: I1211 13:06:51.549344 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:51 crc kubenswrapper[4898]: I1211 13:06:51.555004 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:06:51 crc kubenswrapper[4898]: I1211 13:06:51.696163 4898 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-4bmq4" Dec 11 13:06:57 crc kubenswrapper[4898]: I1211 13:06:57.404776 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:07:04 crc kubenswrapper[4898]: I1211 13:07:04.995819 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:07:04 crc kubenswrapper[4898]: I1211 13:07:04.997398 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:07:07 crc kubenswrapper[4898]: I1211 13:07:07.186239 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zcq7l" event={"ID":"34380c7c-1d75-4f6f-a6cb-b015a55ca978","Type":"ContainerStarted","Data":"2aa28d3ec8564d1ba44ad9940b9cd0610637fed9d73d7b56d50a8d895f8c94f3"} Dec 11 13:07:08 crc kubenswrapper[4898]: I1211 13:07:08.209006 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zcq7l" podStartSLOduration=167.208989673 podStartE2EDuration="2m47.208989673s" podCreationTimestamp="2025-12-11 13:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:07:08.206107103 +0000 UTC m=+185.778433540" watchObservedRunningTime="2025-12-11 13:07:08.208989673 +0000 UTC m=+185.781316110" Dec 11 13:07:11 crc 
kubenswrapper[4898]: I1211 13:07:11.105331 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 13:07:12 crc kubenswrapper[4898]: I1211 13:07:12.428106 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.020330 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 11 13:07:17 crc kubenswrapper[4898]: E1211 13:07:17.020795 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5f2a47-e760-48ef-80e8-7478de392bd9" containerName="pruner" Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.020806 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5f2a47-e760-48ef-80e8-7478de392bd9" containerName="pruner" Dec 11 13:07:17 crc kubenswrapper[4898]: E1211 13:07:17.020820 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22920719-7563-4c10-b96f-0d30fb4f55bf" containerName="pruner" Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.020826 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="22920719-7563-4c10-b96f-0d30fb4f55bf" containerName="pruner" Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.020916 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b5f2a47-e760-48ef-80e8-7478de392bd9" containerName="pruner" Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.020926 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="22920719-7563-4c10-b96f-0d30fb4f55bf" containerName="pruner" Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.021250 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.022604 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.026882 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.027099 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.161535 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41cb0379-2896-484c-83bc-5137fa621abe-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41cb0379-2896-484c-83bc-5137fa621abe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.161768 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41cb0379-2896-484c-83bc-5137fa621abe-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41cb0379-2896-484c-83bc-5137fa621abe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.263559 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41cb0379-2896-484c-83bc-5137fa621abe-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41cb0379-2896-484c-83bc-5137fa621abe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.263615 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/41cb0379-2896-484c-83bc-5137fa621abe-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41cb0379-2896-484c-83bc-5137fa621abe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.263704 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41cb0379-2896-484c-83bc-5137fa621abe-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41cb0379-2896-484c-83bc-5137fa621abe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.287593 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41cb0379-2896-484c-83bc-5137fa621abe-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41cb0379-2896-484c-83bc-5137fa621abe\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 13:07:17 crc kubenswrapper[4898]: I1211 13:07:17.355096 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 13:07:17 crc kubenswrapper[4898]: E1211 13:07:17.590521 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 11 13:07:17 crc kubenswrapper[4898]: E1211 13:07:17.591008 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rkpxv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-lsvvk_openshift-marketplace(815c6898-33a4-4a0d-b751-689267c17053): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 11 13:07:17 crc kubenswrapper[4898]: E1211 13:07:17.592257 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-lsvvk" podUID="815c6898-33a4-4a0d-b751-689267c17053" Dec 11 13:07:18 crc kubenswrapper[4898]: E1211 13:07:18.004070 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 11 13:07:18 crc kubenswrapper[4898]: E1211 13:07:18.004300 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tqstf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hhtm7_openshift-marketplace(01b72963-e048-458b-8e4b-7642a1dfc096): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 11 13:07:18 crc kubenswrapper[4898]: E1211 13:07:18.005539 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hhtm7" podUID="01b72963-e048-458b-8e4b-7642a1dfc096" Dec 11 13:07:19 crc 
kubenswrapper[4898]: E1211 13:07:19.564071 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-lsvvk" podUID="815c6898-33a4-4a0d-b751-689267c17053" Dec 11 13:07:19 crc kubenswrapper[4898]: E1211 13:07:19.564147 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hhtm7" podUID="01b72963-e048-458b-8e4b-7642a1dfc096" Dec 11 13:07:19 crc kubenswrapper[4898]: E1211 13:07:19.649646 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 11 13:07:19 crc kubenswrapper[4898]: E1211 13:07:19.649797 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ss76d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-946ll_openshift-marketplace(ceb008dd-582e-42f0-b83a-d3523d677e61): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 11 13:07:19 crc kubenswrapper[4898]: E1211 13:07:19.651266 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-946ll" podUID="ceb008dd-582e-42f0-b83a-d3523d677e61" Dec 11 13:07:21 crc 
kubenswrapper[4898]: I1211 13:07:21.615315 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 11 13:07:21 crc kubenswrapper[4898]: I1211 13:07:21.616730 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 11 13:07:21 crc kubenswrapper[4898]: I1211 13:07:21.619665 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 11 13:07:21 crc kubenswrapper[4898]: I1211 13:07:21.714544 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8b8fd27-089f-46e4-985c-09e1feb795aa-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d8b8fd27-089f-46e4-985c-09e1feb795aa\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 13:07:21 crc kubenswrapper[4898]: I1211 13:07:21.714602 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d8b8fd27-089f-46e4-985c-09e1feb795aa-var-lock\") pod \"installer-9-crc\" (UID: \"d8b8fd27-089f-46e4-985c-09e1feb795aa\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 13:07:21 crc kubenswrapper[4898]: I1211 13:07:21.714677 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8b8fd27-089f-46e4-985c-09e1feb795aa-kube-api-access\") pod \"installer-9-crc\" (UID: \"d8b8fd27-089f-46e4-985c-09e1feb795aa\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 13:07:21 crc kubenswrapper[4898]: I1211 13:07:21.815857 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d8b8fd27-089f-46e4-985c-09e1feb795aa-var-lock\") pod \"installer-9-crc\" (UID: \"d8b8fd27-089f-46e4-985c-09e1feb795aa\") " 
pod="openshift-kube-apiserver/installer-9-crc" Dec 11 13:07:21 crc kubenswrapper[4898]: I1211 13:07:21.816079 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8b8fd27-089f-46e4-985c-09e1feb795aa-kube-api-access\") pod \"installer-9-crc\" (UID: \"d8b8fd27-089f-46e4-985c-09e1feb795aa\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 13:07:21 crc kubenswrapper[4898]: I1211 13:07:21.816142 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d8b8fd27-089f-46e4-985c-09e1feb795aa-var-lock\") pod \"installer-9-crc\" (UID: \"d8b8fd27-089f-46e4-985c-09e1feb795aa\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 13:07:21 crc kubenswrapper[4898]: I1211 13:07:21.816488 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8b8fd27-089f-46e4-985c-09e1feb795aa-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d8b8fd27-089f-46e4-985c-09e1feb795aa\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 13:07:21 crc kubenswrapper[4898]: I1211 13:07:21.816961 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8b8fd27-089f-46e4-985c-09e1feb795aa-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d8b8fd27-089f-46e4-985c-09e1feb795aa\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 13:07:21 crc kubenswrapper[4898]: I1211 13:07:21.835069 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8b8fd27-089f-46e4-985c-09e1feb795aa-kube-api-access\") pod \"installer-9-crc\" (UID: \"d8b8fd27-089f-46e4-985c-09e1feb795aa\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 13:07:21 crc kubenswrapper[4898]: I1211 13:07:21.947627 4898 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 11 13:07:22 crc kubenswrapper[4898]: E1211 13:07:22.326229 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-946ll" podUID="ceb008dd-582e-42f0-b83a-d3523d677e61" Dec 11 13:07:22 crc kubenswrapper[4898]: E1211 13:07:22.399486 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 11 13:07:22 crc kubenswrapper[4898]: E1211 13:07:22.399628 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jhfdb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-mmpgc_openshift-marketplace(0fca7537-45c2-4c42-adab-0c373132c342): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 11 13:07:22 crc kubenswrapper[4898]: E1211 13:07:22.400915 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-mmpgc" podUID="0fca7537-45c2-4c42-adab-0c373132c342" Dec 11 13:07:23 crc 
kubenswrapper[4898]: E1211 13:07:23.335229 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 11 13:07:23 crc kubenswrapper[4898]: E1211 13:07:23.335380 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6txzk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-mpfz8_openshift-marketplace(8c5302aa-5a81-4257-93b8-aefd5e5cc2ed): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 11 13:07:23 crc kubenswrapper[4898]: E1211 13:07:23.336540 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mpfz8" podUID="8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" Dec 11 13:07:26 crc kubenswrapper[4898]: E1211 13:07:26.455490 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-mmpgc" podUID="0fca7537-45c2-4c42-adab-0c373132c342" Dec 11 13:07:26 crc kubenswrapper[4898]: E1211 13:07:26.456277 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mpfz8" podUID="8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" Dec 11 13:07:26 crc kubenswrapper[4898]: E1211 13:07:26.597627 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 11 13:07:26 crc kubenswrapper[4898]: E1211 13:07:26.598026 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j458j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-kl8sx_openshift-marketplace(9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 11 13:07:26 crc kubenswrapper[4898]: E1211 13:07:26.599247 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-kl8sx" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" Dec 11 13:07:26 crc kubenswrapper[4898]: I1211 13:07:26.728403 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 11 13:07:26 crc kubenswrapper[4898]: W1211 13:07:26.753994 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod41cb0379_2896_484c_83bc_5137fa621abe.slice/crio-f43f3ab27c72830bf82202c2de52fdd10e4ac0a209c0ce7235d21b9ef96c1697 WatchSource:0}: Error finding container f43f3ab27c72830bf82202c2de52fdd10e4ac0a209c0ce7235d21b9ef96c1697: Status 404 returned error can't find the container with id f43f3ab27c72830bf82202c2de52fdd10e4ac0a209c0ce7235d21b9ef96c1697 Dec 11 13:07:27 crc kubenswrapper[4898]: I1211 13:07:27.006395 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 11 13:07:27 crc kubenswrapper[4898]: I1211 13:07:27.290502 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"41cb0379-2896-484c-83bc-5137fa621abe","Type":"ContainerStarted","Data":"f43f3ab27c72830bf82202c2de52fdd10e4ac0a209c0ce7235d21b9ef96c1697"} Dec 11 13:07:27 crc kubenswrapper[4898]: I1211 13:07:27.292080 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d8b8fd27-089f-46e4-985c-09e1feb795aa","Type":"ContainerStarted","Data":"91cac5de6b3d76116700b65b6056c167283246b9de4aadf2a9714ef6b5ba0954"} Dec 11 13:07:27 crc kubenswrapper[4898]: I1211 13:07:27.294299 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnsvs" event={"ID":"e3afb839-6915-42b9-9b88-72d11bf5caa2","Type":"ContainerStarted","Data":"c23675cc5b4b314df18f2c5f7837e0cabe79692803612d0e426ab6af2c5abd17"} 
Dec 11 13:07:27 crc kubenswrapper[4898]: I1211 13:07:27.296812 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ch69p" event={"ID":"6d4f5c98-5dd0-497e-be83-b8d669c734ad","Type":"ContainerStarted","Data":"966f320d3540476659073bc47df9fbac48589a763d9b16a24db1c6a4f8c1f55c"} Dec 11 13:07:27 crc kubenswrapper[4898]: E1211 13:07:27.297352 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kl8sx" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" Dec 11 13:07:29 crc kubenswrapper[4898]: I1211 13:07:29.306957 4898 generic.go:334] "Generic (PLEG): container finished" podID="41cb0379-2896-484c-83bc-5137fa621abe" containerID="4307e0551686e602b89117fca0b74ecfd096cc7075903843968c13bc76010c70" exitCode=0 Dec 11 13:07:29 crc kubenswrapper[4898]: I1211 13:07:29.307026 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"41cb0379-2896-484c-83bc-5137fa621abe","Type":"ContainerDied","Data":"4307e0551686e602b89117fca0b74ecfd096cc7075903843968c13bc76010c70"} Dec 11 13:07:29 crc kubenswrapper[4898]: I1211 13:07:29.309318 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d8b8fd27-089f-46e4-985c-09e1feb795aa","Type":"ContainerStarted","Data":"d26b466e445b0f327eb76fb60561b9d0c644f7dea23e43dcfe13bd8e132730f2"} Dec 11 13:07:29 crc kubenswrapper[4898]: I1211 13:07:29.311126 4898 generic.go:334] "Generic (PLEG): container finished" podID="e3afb839-6915-42b9-9b88-72d11bf5caa2" containerID="c23675cc5b4b314df18f2c5f7837e0cabe79692803612d0e426ab6af2c5abd17" exitCode=0 Dec 11 13:07:29 crc kubenswrapper[4898]: I1211 13:07:29.311178 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-dnsvs" event={"ID":"e3afb839-6915-42b9-9b88-72d11bf5caa2","Type":"ContainerDied","Data":"c23675cc5b4b314df18f2c5f7837e0cabe79692803612d0e426ab6af2c5abd17"} Dec 11 13:07:29 crc kubenswrapper[4898]: I1211 13:07:29.312692 4898 generic.go:334] "Generic (PLEG): container finished" podID="6d4f5c98-5dd0-497e-be83-b8d669c734ad" containerID="966f320d3540476659073bc47df9fbac48589a763d9b16a24db1c6a4f8c1f55c" exitCode=0 Dec 11 13:07:29 crc kubenswrapper[4898]: I1211 13:07:29.312715 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ch69p" event={"ID":"6d4f5c98-5dd0-497e-be83-b8d669c734ad","Type":"ContainerDied","Data":"966f320d3540476659073bc47df9fbac48589a763d9b16a24db1c6a4f8c1f55c"} Dec 11 13:07:29 crc kubenswrapper[4898]: I1211 13:07:29.339164 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=8.339148217 podStartE2EDuration="8.339148217s" podCreationTimestamp="2025-12-11 13:07:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:07:29.336133766 +0000 UTC m=+206.908460213" watchObservedRunningTime="2025-12-11 13:07:29.339148217 +0000 UTC m=+206.911474654" Dec 11 13:07:30 crc kubenswrapper[4898]: E1211 13:07:30.415625 4898 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Dec 11 13:07:30 crc kubenswrapper[4898]: I1211 13:07:30.531000 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 13:07:30 crc kubenswrapper[4898]: I1211 13:07:30.725874 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41cb0379-2896-484c-83bc-5137fa621abe-kubelet-dir\") pod \"41cb0379-2896-484c-83bc-5137fa621abe\" (UID: \"41cb0379-2896-484c-83bc-5137fa621abe\") " Dec 11 13:07:30 crc kubenswrapper[4898]: I1211 13:07:30.725959 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41cb0379-2896-484c-83bc-5137fa621abe-kube-api-access\") pod \"41cb0379-2896-484c-83bc-5137fa621abe\" (UID: \"41cb0379-2896-484c-83bc-5137fa621abe\") " Dec 11 13:07:30 crc kubenswrapper[4898]: I1211 13:07:30.726266 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41cb0379-2896-484c-83bc-5137fa621abe-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "41cb0379-2896-484c-83bc-5137fa621abe" (UID: "41cb0379-2896-484c-83bc-5137fa621abe"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:07:30 crc kubenswrapper[4898]: I1211 13:07:30.726388 4898 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41cb0379-2896-484c-83bc-5137fa621abe-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:30 crc kubenswrapper[4898]: I1211 13:07:30.735577 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41cb0379-2896-484c-83bc-5137fa621abe-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "41cb0379-2896-484c-83bc-5137fa621abe" (UID: "41cb0379-2896-484c-83bc-5137fa621abe"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:07:30 crc kubenswrapper[4898]: I1211 13:07:30.764613 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lbjzz"] Dec 11 13:07:30 crc kubenswrapper[4898]: I1211 13:07:30.827341 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41cb0379-2896-484c-83bc-5137fa621abe-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:31 crc kubenswrapper[4898]: I1211 13:07:31.322945 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"41cb0379-2896-484c-83bc-5137fa621abe","Type":"ContainerDied","Data":"f43f3ab27c72830bf82202c2de52fdd10e4ac0a209c0ce7235d21b9ef96c1697"} Dec 11 13:07:31 crc kubenswrapper[4898]: I1211 13:07:31.323146 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f43f3ab27c72830bf82202c2de52fdd10e4ac0a209c0ce7235d21b9ef96c1697" Dec 11 13:07:31 crc kubenswrapper[4898]: I1211 13:07:31.322990 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 13:07:33 crc kubenswrapper[4898]: I1211 13:07:33.333570 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ch69p" event={"ID":"6d4f5c98-5dd0-497e-be83-b8d669c734ad","Type":"ContainerStarted","Data":"6ca9f239355d302abbc61969ed8c8bb9052cbecc480efaad15a2fa580c681361"} Dec 11 13:07:33 crc kubenswrapper[4898]: I1211 13:07:33.351727 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ch69p" podStartSLOduration=2.913857465 podStartE2EDuration="52.351712484s" podCreationTimestamp="2025-12-11 13:06:41 +0000 UTC" firstStartedPulling="2025-12-11 13:06:42.983796372 +0000 UTC m=+160.556122809" lastFinishedPulling="2025-12-11 13:07:32.421651381 +0000 UTC m=+209.993977828" observedRunningTime="2025-12-11 13:07:33.349664026 +0000 UTC m=+210.921990463" watchObservedRunningTime="2025-12-11 13:07:33.351712484 +0000 UTC m=+210.924038921" Dec 11 13:07:34 crc kubenswrapper[4898]: I1211 13:07:34.995792 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:07:34 crc kubenswrapper[4898]: I1211 13:07:34.996576 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:07:34 crc kubenswrapper[4898]: I1211 13:07:34.996624 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:07:34 
crc kubenswrapper[4898]: I1211 13:07:34.997130 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 13:07:34 crc kubenswrapper[4898]: I1211 13:07:34.997224 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6" gracePeriod=600 Dec 11 13:07:36 crc kubenswrapper[4898]: I1211 13:07:36.353423 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6" exitCode=0 Dec 11 13:07:36 crc kubenswrapper[4898]: I1211 13:07:36.353675 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6"} Dec 11 13:07:38 crc kubenswrapper[4898]: I1211 13:07:38.381742 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnsvs" event={"ID":"e3afb839-6915-42b9-9b88-72d11bf5caa2","Type":"ContainerStarted","Data":"c9d3948613f10500c34ef4e92fb7d39072b31f772f6a601c81f891d5536d62d9"} Dec 11 13:07:38 crc kubenswrapper[4898]: I1211 13:07:38.384316 4898 generic.go:334] "Generic (PLEG): container finished" podID="01b72963-e048-458b-8e4b-7642a1dfc096" containerID="834a3fccd41a7479e3f57aa38c194c72fc2b8aee0bc468b5034daa12a983f756" exitCode=0 Dec 11 13:07:38 crc 
kubenswrapper[4898]: I1211 13:07:38.384337 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhtm7" event={"ID":"01b72963-e048-458b-8e4b-7642a1dfc096","Type":"ContainerDied","Data":"834a3fccd41a7479e3f57aa38c194c72fc2b8aee0bc468b5034daa12a983f756"} Dec 11 13:07:38 crc kubenswrapper[4898]: I1211 13:07:38.386102 4898 generic.go:334] "Generic (PLEG): container finished" podID="815c6898-33a4-4a0d-b751-689267c17053" containerID="f41ed5bd5339af68e5578a8bda1e30b14b5277af77862560d2b3d7cb0bf9d45b" exitCode=0 Dec 11 13:07:38 crc kubenswrapper[4898]: I1211 13:07:38.386165 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsvvk" event={"ID":"815c6898-33a4-4a0d-b751-689267c17053","Type":"ContainerDied","Data":"f41ed5bd5339af68e5578a8bda1e30b14b5277af77862560d2b3d7cb0bf9d45b"} Dec 11 13:07:38 crc kubenswrapper[4898]: I1211 13:07:38.389803 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"f637b15c5c6a67943cd9b7a091adbf77f0c3f8526911048606b42e00cbbb2e61"} Dec 11 13:07:38 crc kubenswrapper[4898]: I1211 13:07:38.404024 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dnsvs" podStartSLOduration=3.09110569 podStartE2EDuration="57.404002608s" podCreationTimestamp="2025-12-11 13:06:41 +0000 UTC" firstStartedPulling="2025-12-11 13:06:42.993646947 +0000 UTC m=+160.565973384" lastFinishedPulling="2025-12-11 13:07:37.306543855 +0000 UTC m=+214.878870302" observedRunningTime="2025-12-11 13:07:38.402954254 +0000 UTC m=+215.975280711" watchObservedRunningTime="2025-12-11 13:07:38.404002608 +0000 UTC m=+215.976329045" Dec 11 13:07:39 crc kubenswrapper[4898]: I1211 13:07:39.397430 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-946ll" event={"ID":"ceb008dd-582e-42f0-b83a-d3523d677e61","Type":"ContainerStarted","Data":"1669a7fac1fe34cca3af082be16d03a5bb2087bdc8e1d9bcea837b0cd094a5ba"} Dec 11 13:07:40 crc kubenswrapper[4898]: I1211 13:07:40.416135 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmpgc" event={"ID":"0fca7537-45c2-4c42-adab-0c373132c342","Type":"ContainerDied","Data":"cd0373435e022c2676c7b77522ca912af29676770986a0ec3e6327734ba10ddc"} Dec 11 13:07:40 crc kubenswrapper[4898]: I1211 13:07:40.416963 4898 generic.go:334] "Generic (PLEG): container finished" podID="0fca7537-45c2-4c42-adab-0c373132c342" containerID="cd0373435e022c2676c7b77522ca912af29676770986a0ec3e6327734ba10ddc" exitCode=0 Dec 11 13:07:40 crc kubenswrapper[4898]: I1211 13:07:40.422312 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhtm7" event={"ID":"01b72963-e048-458b-8e4b-7642a1dfc096","Type":"ContainerStarted","Data":"74154b5c4b3b0b8832e276e1fd7898cb3dbb9e4bcbc5171d4e1a785c08c2e5bc"} Dec 11 13:07:40 crc kubenswrapper[4898]: I1211 13:07:40.424418 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsvvk" event={"ID":"815c6898-33a4-4a0d-b751-689267c17053","Type":"ContainerStarted","Data":"4028daaf0a69229623ed7dbde7f9fe76d42430aa92c52a7684088e773ee6347d"} Dec 11 13:07:40 crc kubenswrapper[4898]: I1211 13:07:40.426345 4898 generic.go:334] "Generic (PLEG): container finished" podID="ceb008dd-582e-42f0-b83a-d3523d677e61" containerID="1669a7fac1fe34cca3af082be16d03a5bb2087bdc8e1d9bcea837b0cd094a5ba" exitCode=0 Dec 11 13:07:40 crc kubenswrapper[4898]: I1211 13:07:40.426380 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-946ll" 
event={"ID":"ceb008dd-582e-42f0-b83a-d3523d677e61","Type":"ContainerDied","Data":"1669a7fac1fe34cca3af082be16d03a5bb2087bdc8e1d9bcea837b0cd094a5ba"} Dec 11 13:07:40 crc kubenswrapper[4898]: I1211 13:07:40.456442 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lsvvk" podStartSLOduration=2.7670404939999997 podStartE2EDuration="1m2.456421273s" podCreationTimestamp="2025-12-11 13:06:38 +0000 UTC" firstStartedPulling="2025-12-11 13:06:39.823477351 +0000 UTC m=+157.395803788" lastFinishedPulling="2025-12-11 13:07:39.51285813 +0000 UTC m=+217.085184567" observedRunningTime="2025-12-11 13:07:40.455523411 +0000 UTC m=+218.027849859" watchObservedRunningTime="2025-12-11 13:07:40.456421273 +0000 UTC m=+218.028747710" Dec 11 13:07:40 crc kubenswrapper[4898]: I1211 13:07:40.503137 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hhtm7" podStartSLOduration=2.456442346 podStartE2EDuration="1m2.503115868s" podCreationTimestamp="2025-12-11 13:06:38 +0000 UTC" firstStartedPulling="2025-12-11 13:06:39.811795075 +0000 UTC m=+157.384121512" lastFinishedPulling="2025-12-11 13:07:39.858468597 +0000 UTC m=+217.430795034" observedRunningTime="2025-12-11 13:07:40.501592422 +0000 UTC m=+218.073918869" watchObservedRunningTime="2025-12-11 13:07:40.503115868 +0000 UTC m=+218.075442305" Dec 11 13:07:41 crc kubenswrapper[4898]: I1211 13:07:41.431820 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-946ll" event={"ID":"ceb008dd-582e-42f0-b83a-d3523d677e61","Type":"ContainerStarted","Data":"6ccb9f4f6313ac40ea0dfe56a7d426c40de4d612b4f45f0cfbb28a6f1512acb5"} Dec 11 13:07:41 crc kubenswrapper[4898]: I1211 13:07:41.434635 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmpgc" 
event={"ID":"0fca7537-45c2-4c42-adab-0c373132c342","Type":"ContainerStarted","Data":"4f37643ee7880d10e37c2a2d33ca1c9580267be5bd02a56426ccb06cb0594a2f"} Dec 11 13:07:41 crc kubenswrapper[4898]: I1211 13:07:41.449813 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-946ll" podStartSLOduration=2.338162181 podStartE2EDuration="1m3.449791243s" podCreationTimestamp="2025-12-11 13:06:38 +0000 UTC" firstStartedPulling="2025-12-11 13:06:39.824377856 +0000 UTC m=+157.396704293" lastFinishedPulling="2025-12-11 13:07:40.936006918 +0000 UTC m=+218.508333355" observedRunningTime="2025-12-11 13:07:41.44797388 +0000 UTC m=+219.020300317" watchObservedRunningTime="2025-12-11 13:07:41.449791243 +0000 UTC m=+219.022117690" Dec 11 13:07:41 crc kubenswrapper[4898]: I1211 13:07:41.473491 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mmpgc" podStartSLOduration=2.402789376 podStartE2EDuration="1m3.473474844s" podCreationTimestamp="2025-12-11 13:06:38 +0000 UTC" firstStartedPulling="2025-12-11 13:06:39.806895808 +0000 UTC m=+157.379222245" lastFinishedPulling="2025-12-11 13:07:40.877581276 +0000 UTC m=+218.449907713" observedRunningTime="2025-12-11 13:07:41.47161936 +0000 UTC m=+219.043945807" watchObservedRunningTime="2025-12-11 13:07:41.473474844 +0000 UTC m=+219.045801291" Dec 11 13:07:41 crc kubenswrapper[4898]: I1211 13:07:41.930104 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:07:41 crc kubenswrapper[4898]: I1211 13:07:41.930498 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:07:42 crc kubenswrapper[4898]: I1211 13:07:42.331959 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:07:42 crc 
kubenswrapper[4898]: I1211 13:07:42.332197 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:07:42 crc kubenswrapper[4898]: I1211 13:07:42.449816 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:07:42 crc kubenswrapper[4898]: I1211 13:07:42.492390 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:07:43 crc kubenswrapper[4898]: I1211 13:07:43.033657 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnsvs" podUID="e3afb839-6915-42b9-9b88-72d11bf5caa2" containerName="registry-server" probeResult="failure" output=< Dec 11 13:07:43 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 13:07:43 crc kubenswrapper[4898]: > Dec 11 13:07:45 crc kubenswrapper[4898]: I1211 13:07:45.205276 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ch69p"] Dec 11 13:07:45 crc kubenswrapper[4898]: I1211 13:07:45.205641 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ch69p" podUID="6d4f5c98-5dd0-497e-be83-b8d669c734ad" containerName="registry-server" containerID="cri-o://6ca9f239355d302abbc61969ed8c8bb9052cbecc480efaad15a2fa580c681361" gracePeriod=2 Dec 11 13:07:46 crc kubenswrapper[4898]: I1211 13:07:46.468971 4898 generic.go:334] "Generic (PLEG): container finished" podID="6d4f5c98-5dd0-497e-be83-b8d669c734ad" containerID="6ca9f239355d302abbc61969ed8c8bb9052cbecc480efaad15a2fa580c681361" exitCode=0 Dec 11 13:07:46 crc kubenswrapper[4898]: I1211 13:07:46.469016 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ch69p" 
event={"ID":"6d4f5c98-5dd0-497e-be83-b8d669c734ad","Type":"ContainerDied","Data":"6ca9f239355d302abbc61969ed8c8bb9052cbecc480efaad15a2fa580c681361"} Dec 11 13:07:48 crc kubenswrapper[4898]: I1211 13:07:48.650170 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lsvvk" Dec 11 13:07:48 crc kubenswrapper[4898]: I1211 13:07:48.650260 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lsvvk" Dec 11 13:07:48 crc kubenswrapper[4898]: I1211 13:07:48.722641 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lsvvk" Dec 11 13:07:48 crc kubenswrapper[4898]: I1211 13:07:48.860990 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mmpgc" Dec 11 13:07:48 crc kubenswrapper[4898]: I1211 13:07:48.861048 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mmpgc" Dec 11 13:07:48 crc kubenswrapper[4898]: I1211 13:07:48.902617 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mmpgc" Dec 11 13:07:49 crc kubenswrapper[4898]: I1211 13:07:49.066113 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hhtm7" Dec 11 13:07:49 crc kubenswrapper[4898]: I1211 13:07:49.066179 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hhtm7" Dec 11 13:07:49 crc kubenswrapper[4898]: I1211 13:07:49.126999 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hhtm7" Dec 11 13:07:49 crc kubenswrapper[4898]: I1211 13:07:49.258206 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-946ll" Dec 11 13:07:49 crc kubenswrapper[4898]: I1211 13:07:49.258271 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-946ll" Dec 11 13:07:49 crc kubenswrapper[4898]: I1211 13:07:49.300851 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-946ll" Dec 11 13:07:49 crc kubenswrapper[4898]: I1211 13:07:49.529386 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hhtm7" Dec 11 13:07:49 crc kubenswrapper[4898]: I1211 13:07:49.542617 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lsvvk" Dec 11 13:07:49 crc kubenswrapper[4898]: I1211 13:07:49.560846 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-946ll" Dec 11 13:07:49 crc kubenswrapper[4898]: I1211 13:07:49.561133 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mmpgc" Dec 11 13:07:51 crc kubenswrapper[4898]: I1211 13:07:51.407934 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hhtm7"] Dec 11 13:07:51 crc kubenswrapper[4898]: I1211 13:07:51.497762 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hhtm7" podUID="01b72963-e048-458b-8e4b-7642a1dfc096" containerName="registry-server" containerID="cri-o://74154b5c4b3b0b8832e276e1fd7898cb3dbb9e4bcbc5171d4e1a785c08c2e5bc" gracePeriod=2 Dec 11 13:07:51 crc kubenswrapper[4898]: I1211 13:07:51.605519 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-946ll"] Dec 11 13:07:51 crc kubenswrapper[4898]: I1211 13:07:51.606697 4898 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-946ll" podUID="ceb008dd-582e-42f0-b83a-d3523d677e61" containerName="registry-server" containerID="cri-o://6ccb9f4f6313ac40ea0dfe56a7d426c40de4d612b4f45f0cfbb28a6f1512acb5" gracePeriod=2 Dec 11 13:07:51 crc kubenswrapper[4898]: I1211 13:07:51.975918 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:07:52 crc kubenswrapper[4898]: I1211 13:07:52.019235 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:07:52 crc kubenswrapper[4898]: E1211 13:07:52.334104 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ca9f239355d302abbc61969ed8c8bb9052cbecc480efaad15a2fa580c681361 is running failed: container process not found" containerID="6ca9f239355d302abbc61969ed8c8bb9052cbecc480efaad15a2fa580c681361" cmd=["grpc_health_probe","-addr=:50051"] Dec 11 13:07:52 crc kubenswrapper[4898]: E1211 13:07:52.334588 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ca9f239355d302abbc61969ed8c8bb9052cbecc480efaad15a2fa580c681361 is running failed: container process not found" containerID="6ca9f239355d302abbc61969ed8c8bb9052cbecc480efaad15a2fa580c681361" cmd=["grpc_health_probe","-addr=:50051"] Dec 11 13:07:52 crc kubenswrapper[4898]: E1211 13:07:52.335343 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ca9f239355d302abbc61969ed8c8bb9052cbecc480efaad15a2fa580c681361 is running failed: container process not found" containerID="6ca9f239355d302abbc61969ed8c8bb9052cbecc480efaad15a2fa580c681361" 
cmd=["grpc_health_probe","-addr=:50051"] Dec 11 13:07:52 crc kubenswrapper[4898]: E1211 13:07:52.335404 4898 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ca9f239355d302abbc61969ed8c8bb9052cbecc480efaad15a2fa580c681361 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-ch69p" podUID="6d4f5c98-5dd0-497e-be83-b8d669c734ad" containerName="registry-server" Dec 11 13:07:53 crc kubenswrapper[4898]: I1211 13:07:53.511774 4898 generic.go:334] "Generic (PLEG): container finished" podID="01b72963-e048-458b-8e4b-7642a1dfc096" containerID="74154b5c4b3b0b8832e276e1fd7898cb3dbb9e4bcbc5171d4e1a785c08c2e5bc" exitCode=0 Dec 11 13:07:53 crc kubenswrapper[4898]: I1211 13:07:53.511813 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhtm7" event={"ID":"01b72963-e048-458b-8e4b-7642a1dfc096","Type":"ContainerDied","Data":"74154b5c4b3b0b8832e276e1fd7898cb3dbb9e4bcbc5171d4e1a785c08c2e5bc"} Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.475596 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-946ll" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.509593 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.532006 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hhtm7" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.540992 4898 generic.go:334] "Generic (PLEG): container finished" podID="ceb008dd-582e-42f0-b83a-d3523d677e61" containerID="6ccb9f4f6313ac40ea0dfe56a7d426c40de4d612b4f45f0cfbb28a6f1512acb5" exitCode=0 Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.541079 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-946ll" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.541077 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-946ll" event={"ID":"ceb008dd-582e-42f0-b83a-d3523d677e61","Type":"ContainerDied","Data":"6ccb9f4f6313ac40ea0dfe56a7d426c40de4d612b4f45f0cfbb28a6f1512acb5"} Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.541208 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-946ll" event={"ID":"ceb008dd-582e-42f0-b83a-d3523d677e61","Type":"ContainerDied","Data":"1e28529044f204c3bab2704e12b531f7536fbd32407b6cb1d5c253119775dee3"} Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.541232 4898 scope.go:117] "RemoveContainer" containerID="6ccb9f4f6313ac40ea0dfe56a7d426c40de4d612b4f45f0cfbb28a6f1512acb5" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.546427 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhtm7" event={"ID":"01b72963-e048-458b-8e4b-7642a1dfc096","Type":"ContainerDied","Data":"bf8beb32f6ff32d80c33248c238cc7cd61c4dc952f34373fde3814634e79d05d"} Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.546515 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hhtm7" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.551867 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ch69p" event={"ID":"6d4f5c98-5dd0-497e-be83-b8d669c734ad","Type":"ContainerDied","Data":"b53bf77012cbafcd144bf38c74f9888d72246aecb2d1122e57e249665afc2f32"} Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.551964 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ch69p" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.567048 4898 scope.go:117] "RemoveContainer" containerID="1669a7fac1fe34cca3af082be16d03a5bb2087bdc8e1d9bcea837b0cd094a5ba" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.579687 4898 scope.go:117] "RemoveContainer" containerID="a374fb785cb718eb3e062f8d9e528f133e4722ac75a7e0c88ab821702fffb460" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.593646 4898 scope.go:117] "RemoveContainer" containerID="6ccb9f4f6313ac40ea0dfe56a7d426c40de4d612b4f45f0cfbb28a6f1512acb5" Dec 11 13:07:54 crc kubenswrapper[4898]: E1211 13:07:54.594405 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ccb9f4f6313ac40ea0dfe56a7d426c40de4d612b4f45f0cfbb28a6f1512acb5\": container with ID starting with 6ccb9f4f6313ac40ea0dfe56a7d426c40de4d612b4f45f0cfbb28a6f1512acb5 not found: ID does not exist" containerID="6ccb9f4f6313ac40ea0dfe56a7d426c40de4d612b4f45f0cfbb28a6f1512acb5" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.594452 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ccb9f4f6313ac40ea0dfe56a7d426c40de4d612b4f45f0cfbb28a6f1512acb5"} err="failed to get container status \"6ccb9f4f6313ac40ea0dfe56a7d426c40de4d612b4f45f0cfbb28a6f1512acb5\": rpc error: code = NotFound desc = could not find container 
\"6ccb9f4f6313ac40ea0dfe56a7d426c40de4d612b4f45f0cfbb28a6f1512acb5\": container with ID starting with 6ccb9f4f6313ac40ea0dfe56a7d426c40de4d612b4f45f0cfbb28a6f1512acb5 not found: ID does not exist" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.594537 4898 scope.go:117] "RemoveContainer" containerID="1669a7fac1fe34cca3af082be16d03a5bb2087bdc8e1d9bcea837b0cd094a5ba" Dec 11 13:07:54 crc kubenswrapper[4898]: E1211 13:07:54.594951 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1669a7fac1fe34cca3af082be16d03a5bb2087bdc8e1d9bcea837b0cd094a5ba\": container with ID starting with 1669a7fac1fe34cca3af082be16d03a5bb2087bdc8e1d9bcea837b0cd094a5ba not found: ID does not exist" containerID="1669a7fac1fe34cca3af082be16d03a5bb2087bdc8e1d9bcea837b0cd094a5ba" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.594978 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1669a7fac1fe34cca3af082be16d03a5bb2087bdc8e1d9bcea837b0cd094a5ba"} err="failed to get container status \"1669a7fac1fe34cca3af082be16d03a5bb2087bdc8e1d9bcea837b0cd094a5ba\": rpc error: code = NotFound desc = could not find container \"1669a7fac1fe34cca3af082be16d03a5bb2087bdc8e1d9bcea837b0cd094a5ba\": container with ID starting with 1669a7fac1fe34cca3af082be16d03a5bb2087bdc8e1d9bcea837b0cd094a5ba not found: ID does not exist" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.594997 4898 scope.go:117] "RemoveContainer" containerID="a374fb785cb718eb3e062f8d9e528f133e4722ac75a7e0c88ab821702fffb460" Dec 11 13:07:54 crc kubenswrapper[4898]: E1211 13:07:54.595351 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a374fb785cb718eb3e062f8d9e528f133e4722ac75a7e0c88ab821702fffb460\": container with ID starting with a374fb785cb718eb3e062f8d9e528f133e4722ac75a7e0c88ab821702fffb460 not found: ID does not exist" 
containerID="a374fb785cb718eb3e062f8d9e528f133e4722ac75a7e0c88ab821702fffb460" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.595381 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a374fb785cb718eb3e062f8d9e528f133e4722ac75a7e0c88ab821702fffb460"} err="failed to get container status \"a374fb785cb718eb3e062f8d9e528f133e4722ac75a7e0c88ab821702fffb460\": rpc error: code = NotFound desc = could not find container \"a374fb785cb718eb3e062f8d9e528f133e4722ac75a7e0c88ab821702fffb460\": container with ID starting with a374fb785cb718eb3e062f8d9e528f133e4722ac75a7e0c88ab821702fffb460 not found: ID does not exist" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.595403 4898 scope.go:117] "RemoveContainer" containerID="74154b5c4b3b0b8832e276e1fd7898cb3dbb9e4bcbc5171d4e1a785c08c2e5bc" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.608961 4898 scope.go:117] "RemoveContainer" containerID="834a3fccd41a7479e3f57aa38c194c72fc2b8aee0bc468b5034daa12a983f756" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.613291 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d4f5c98-5dd0-497e-be83-b8d669c734ad-catalog-content\") pod \"6d4f5c98-5dd0-497e-be83-b8d669c734ad\" (UID: \"6d4f5c98-5dd0-497e-be83-b8d669c734ad\") " Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.614516 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d4f5c98-5dd0-497e-be83-b8d669c734ad-utilities" (OuterVolumeSpecName: "utilities") pod "6d4f5c98-5dd0-497e-be83-b8d669c734ad" (UID: "6d4f5c98-5dd0-497e-be83-b8d669c734ad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.613516 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d4f5c98-5dd0-497e-be83-b8d669c734ad-utilities\") pod \"6d4f5c98-5dd0-497e-be83-b8d669c734ad\" (UID: \"6d4f5c98-5dd0-497e-be83-b8d669c734ad\") " Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.614579 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb008dd-582e-42f0-b83a-d3523d677e61-catalog-content\") pod \"ceb008dd-582e-42f0-b83a-d3523d677e61\" (UID: \"ceb008dd-582e-42f0-b83a-d3523d677e61\") " Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.614650 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b72963-e048-458b-8e4b-7642a1dfc096-utilities\") pod \"01b72963-e048-458b-8e4b-7642a1dfc096\" (UID: \"01b72963-e048-458b-8e4b-7642a1dfc096\") " Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.614700 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss76d\" (UniqueName: \"kubernetes.io/projected/ceb008dd-582e-42f0-b83a-d3523d677e61-kube-api-access-ss76d\") pod \"ceb008dd-582e-42f0-b83a-d3523d677e61\" (UID: \"ceb008dd-582e-42f0-b83a-d3523d677e61\") " Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.614728 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb008dd-582e-42f0-b83a-d3523d677e61-utilities\") pod \"ceb008dd-582e-42f0-b83a-d3523d677e61\" (UID: \"ceb008dd-582e-42f0-b83a-d3523d677e61\") " Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.614749 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/01b72963-e048-458b-8e4b-7642a1dfc096-catalog-content\") pod \"01b72963-e048-458b-8e4b-7642a1dfc096\" (UID: \"01b72963-e048-458b-8e4b-7642a1dfc096\") " Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.614813 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnswd\" (UniqueName: \"kubernetes.io/projected/6d4f5c98-5dd0-497e-be83-b8d669c734ad-kube-api-access-tnswd\") pod \"6d4f5c98-5dd0-497e-be83-b8d669c734ad\" (UID: \"6d4f5c98-5dd0-497e-be83-b8d669c734ad\") " Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.615178 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d4f5c98-5dd0-497e-be83-b8d669c734ad-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.615554 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01b72963-e048-458b-8e4b-7642a1dfc096-utilities" (OuterVolumeSpecName: "utilities") pod "01b72963-e048-458b-8e4b-7642a1dfc096" (UID: "01b72963-e048-458b-8e4b-7642a1dfc096"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.616677 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceb008dd-582e-42f0-b83a-d3523d677e61-utilities" (OuterVolumeSpecName: "utilities") pod "ceb008dd-582e-42f0-b83a-d3523d677e61" (UID: "ceb008dd-582e-42f0-b83a-d3523d677e61"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.621768 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d4f5c98-5dd0-497e-be83-b8d669c734ad-kube-api-access-tnswd" (OuterVolumeSpecName: "kube-api-access-tnswd") pod "6d4f5c98-5dd0-497e-be83-b8d669c734ad" (UID: "6d4f5c98-5dd0-497e-be83-b8d669c734ad"). InnerVolumeSpecName "kube-api-access-tnswd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.622514 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceb008dd-582e-42f0-b83a-d3523d677e61-kube-api-access-ss76d" (OuterVolumeSpecName: "kube-api-access-ss76d") pod "ceb008dd-582e-42f0-b83a-d3523d677e61" (UID: "ceb008dd-582e-42f0-b83a-d3523d677e61"). InnerVolumeSpecName "kube-api-access-ss76d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.628295 4898 scope.go:117] "RemoveContainer" containerID="41427437818a6458bcb6b899f47745695f57624f80f0bb6b060c263b58405966" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.644689 4898 scope.go:117] "RemoveContainer" containerID="6ca9f239355d302abbc61969ed8c8bb9052cbecc480efaad15a2fa580c681361" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.664446 4898 scope.go:117] "RemoveContainer" containerID="966f320d3540476659073bc47df9fbac48589a763d9b16a24db1c6a4f8c1f55c" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.670389 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceb008dd-582e-42f0-b83a-d3523d677e61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ceb008dd-582e-42f0-b83a-d3523d677e61" (UID: "ceb008dd-582e-42f0-b83a-d3523d677e61"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.679818 4898 scope.go:117] "RemoveContainer" containerID="82bd450040b03e2c45ee6a41587dc9612afdcb1b43f04007a766e7d1cde162cc" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.680371 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01b72963-e048-458b-8e4b-7642a1dfc096-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01b72963-e048-458b-8e4b-7642a1dfc096" (UID: "01b72963-e048-458b-8e4b-7642a1dfc096"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.715860 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqstf\" (UniqueName: \"kubernetes.io/projected/01b72963-e048-458b-8e4b-7642a1dfc096-kube-api-access-tqstf\") pod \"01b72963-e048-458b-8e4b-7642a1dfc096\" (UID: \"01b72963-e048-458b-8e4b-7642a1dfc096\") " Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.716186 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss76d\" (UniqueName: \"kubernetes.io/projected/ceb008dd-582e-42f0-b83a-d3523d677e61-kube-api-access-ss76d\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.716207 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b72963-e048-458b-8e4b-7642a1dfc096-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.716221 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb008dd-582e-42f0-b83a-d3523d677e61-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.716233 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/01b72963-e048-458b-8e4b-7642a1dfc096-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.716245 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnswd\" (UniqueName: \"kubernetes.io/projected/6d4f5c98-5dd0-497e-be83-b8d669c734ad-kube-api-access-tnswd\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.716255 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb008dd-582e-42f0-b83a-d3523d677e61-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.719659 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01b72963-e048-458b-8e4b-7642a1dfc096-kube-api-access-tqstf" (OuterVolumeSpecName: "kube-api-access-tqstf") pod "01b72963-e048-458b-8e4b-7642a1dfc096" (UID: "01b72963-e048-458b-8e4b-7642a1dfc096"). InnerVolumeSpecName "kube-api-access-tqstf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.753140 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d4f5c98-5dd0-497e-be83-b8d669c734ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d4f5c98-5dd0-497e-be83-b8d669c734ad" (UID: "6d4f5c98-5dd0-497e-be83-b8d669c734ad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.816968 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d4f5c98-5dd0-497e-be83-b8d669c734ad-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.817002 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqstf\" (UniqueName: \"kubernetes.io/projected/01b72963-e048-458b-8e4b-7642a1dfc096-kube-api-access-tqstf\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.861823 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-946ll"] Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.871066 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-946ll"] Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.873202 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hhtm7"] Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.875661 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hhtm7"] Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.977306 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ch69p"] Dec 11 13:07:54 crc kubenswrapper[4898]: I1211 13:07:54.982613 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ch69p"] Dec 11 13:07:55 crc kubenswrapper[4898]: I1211 13:07:55.557881 4898 generic.go:334] "Generic (PLEG): container finished" podID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" containerID="e7a98467bf023c9eb677a9d51cccd7aee17c57c5b019db21a6ed7d14acec6321" exitCode=0 Dec 11 13:07:55 crc kubenswrapper[4898]: I1211 13:07:55.557978 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-kl8sx" event={"ID":"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389","Type":"ContainerDied","Data":"e7a98467bf023c9eb677a9d51cccd7aee17c57c5b019db21a6ed7d14acec6321"} Dec 11 13:07:55 crc kubenswrapper[4898]: I1211 13:07:55.568679 4898 generic.go:334] "Generic (PLEG): container finished" podID="8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" containerID="cd3f845b260d077344ddcda1347bf579abd72a822ddb9d37e5986d4b0f914ef8" exitCode=0 Dec 11 13:07:55 crc kubenswrapper[4898]: I1211 13:07:55.568757 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpfz8" event={"ID":"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed","Type":"ContainerDied","Data":"cd3f845b260d077344ddcda1347bf579abd72a822ddb9d37e5986d4b0f914ef8"} Dec 11 13:07:55 crc kubenswrapper[4898]: I1211 13:07:55.807260 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" podUID="199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" containerName="oauth-openshift" containerID="cri-o://e5b152c52c061285811afdcf1c1031c3c20070d654a4cf616d6c3e808aec136e" gracePeriod=15 Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.282037 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.438617 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-provider-selection\") pod \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.438719 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-router-certs\") pod \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.438748 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-cliconfig\") pod \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.438789 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmt9s\" (UniqueName: \"kubernetes.io/projected/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-kube-api-access-lmt9s\") pod \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.438819 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-serving-cert\") pod \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\" (UID: 
\"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.438862 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-audit-dir\") pod \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.438894 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-error\") pod \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.438979 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" (UID: "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.439066 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-login\") pod \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.439102 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-audit-policies\") pod \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.439561 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" (UID: "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.439694 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" (UID: "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.439735 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-session\") pod \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.439791 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-trusted-ca-bundle\") pod \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.439818 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-ocp-branding-template\") pod \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.440305 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-service-ca\") pod \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\" (UID: \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.440949 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-idp-0-file-data\") pod \"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\" (UID: 
\"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6\") " Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.440235 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" (UID: "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.440889 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" (UID: "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.441630 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.441656 4898 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.441669 4898 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.441680 4898 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.441692 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.454427 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" (UID: "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.459064 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" (UID: "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.459115 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" (UID: "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.459174 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" (UID: "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.459185 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" (UID: "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.459225 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" (UID: "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.459270 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-kube-api-access-lmt9s" (OuterVolumeSpecName: "kube-api-access-lmt9s") pod "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" (UID: "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6"). InnerVolumeSpecName "kube-api-access-lmt9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.459296 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" (UID: "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.459681 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" (UID: "199efc5b-5cf8-4d4b-85ad-44bdcd033bd6"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.542331 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.542364 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.542376 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.542387 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.542398 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.542410 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.542420 4898 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-lmt9s\" (UniqueName: \"kubernetes.io/projected/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-kube-api-access-lmt9s\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.542429 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.542437 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.575501 4898 generic.go:334] "Generic (PLEG): container finished" podID="199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" containerID="e5b152c52c061285811afdcf1c1031c3c20070d654a4cf616d6c3e808aec136e" exitCode=0 Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.575560 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" event={"ID":"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6","Type":"ContainerDied","Data":"e5b152c52c061285811afdcf1c1031c3c20070d654a4cf616d6c3e808aec136e"} Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.575590 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" event={"ID":"199efc5b-5cf8-4d4b-85ad-44bdcd033bd6","Type":"ContainerDied","Data":"22938ddf3d1b682fb07ad119b578ae2f7cadfe39e5b9da228a918ebdb7b94f21"} Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.575616 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lbjzz" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.575621 4898 scope.go:117] "RemoveContainer" containerID="e5b152c52c061285811afdcf1c1031c3c20070d654a4cf616d6c3e808aec136e" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.591181 4898 scope.go:117] "RemoveContainer" containerID="e5b152c52c061285811afdcf1c1031c3c20070d654a4cf616d6c3e808aec136e" Dec 11 13:07:56 crc kubenswrapper[4898]: E1211 13:07:56.591566 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5b152c52c061285811afdcf1c1031c3c20070d654a4cf616d6c3e808aec136e\": container with ID starting with e5b152c52c061285811afdcf1c1031c3c20070d654a4cf616d6c3e808aec136e not found: ID does not exist" containerID="e5b152c52c061285811afdcf1c1031c3c20070d654a4cf616d6c3e808aec136e" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.591594 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5b152c52c061285811afdcf1c1031c3c20070d654a4cf616d6c3e808aec136e"} err="failed to get container status \"e5b152c52c061285811afdcf1c1031c3c20070d654a4cf616d6c3e808aec136e\": rpc error: code = NotFound desc = could not find container \"e5b152c52c061285811afdcf1c1031c3c20070d654a4cf616d6c3e808aec136e\": container with ID starting with e5b152c52c061285811afdcf1c1031c3c20070d654a4cf616d6c3e808aec136e not found: ID does not exist" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.612710 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lbjzz"] Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.619507 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lbjzz"] Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.781211 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="01b72963-e048-458b-8e4b-7642a1dfc096" path="/var/lib/kubelet/pods/01b72963-e048-458b-8e4b-7642a1dfc096/volumes" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.781859 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" path="/var/lib/kubelet/pods/199efc5b-5cf8-4d4b-85ad-44bdcd033bd6/volumes" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.782279 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d4f5c98-5dd0-497e-be83-b8d669c734ad" path="/var/lib/kubelet/pods/6d4f5c98-5dd0-497e-be83-b8d669c734ad/volumes" Dec 11 13:07:56 crc kubenswrapper[4898]: I1211 13:07:56.782828 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceb008dd-582e-42f0-b83a-d3523d677e61" path="/var/lib/kubelet/pods/ceb008dd-582e-42f0-b83a-d3523d677e61/volumes" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481196 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6cb668d466-7t64n"] Dec 11 13:07:57 crc kubenswrapper[4898]: E1211 13:07:57.481425 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b72963-e048-458b-8e4b-7642a1dfc096" containerName="extract-utilities" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481439 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b72963-e048-458b-8e4b-7642a1dfc096" containerName="extract-utilities" Dec 11 13:07:57 crc kubenswrapper[4898]: E1211 13:07:57.481449 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d4f5c98-5dd0-497e-be83-b8d669c734ad" containerName="extract-content" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481475 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d4f5c98-5dd0-497e-be83-b8d669c734ad" containerName="extract-content" Dec 11 13:07:57 crc kubenswrapper[4898]: E1211 13:07:57.481486 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="01b72963-e048-458b-8e4b-7642a1dfc096" containerName="registry-server" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481492 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b72963-e048-458b-8e4b-7642a1dfc096" containerName="registry-server" Dec 11 13:07:57 crc kubenswrapper[4898]: E1211 13:07:57.481501 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb008dd-582e-42f0-b83a-d3523d677e61" containerName="extract-utilities" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481507 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb008dd-582e-42f0-b83a-d3523d677e61" containerName="extract-utilities" Dec 11 13:07:57 crc kubenswrapper[4898]: E1211 13:07:57.481514 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" containerName="oauth-openshift" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481520 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" containerName="oauth-openshift" Dec 11 13:07:57 crc kubenswrapper[4898]: E1211 13:07:57.481528 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b72963-e048-458b-8e4b-7642a1dfc096" containerName="extract-content" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481535 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b72963-e048-458b-8e4b-7642a1dfc096" containerName="extract-content" Dec 11 13:07:57 crc kubenswrapper[4898]: E1211 13:07:57.481542 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d4f5c98-5dd0-497e-be83-b8d669c734ad" containerName="extract-utilities" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481547 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d4f5c98-5dd0-497e-be83-b8d669c734ad" containerName="extract-utilities" Dec 11 13:07:57 crc kubenswrapper[4898]: E1211 13:07:57.481555 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ceb008dd-582e-42f0-b83a-d3523d677e61" containerName="registry-server" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481561 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb008dd-582e-42f0-b83a-d3523d677e61" containerName="registry-server" Dec 11 13:07:57 crc kubenswrapper[4898]: E1211 13:07:57.481573 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41cb0379-2896-484c-83bc-5137fa621abe" containerName="pruner" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481578 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="41cb0379-2896-484c-83bc-5137fa621abe" containerName="pruner" Dec 11 13:07:57 crc kubenswrapper[4898]: E1211 13:07:57.481584 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb008dd-582e-42f0-b83a-d3523d677e61" containerName="extract-content" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481590 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb008dd-582e-42f0-b83a-d3523d677e61" containerName="extract-content" Dec 11 13:07:57 crc kubenswrapper[4898]: E1211 13:07:57.481623 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d4f5c98-5dd0-497e-be83-b8d669c734ad" containerName="registry-server" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481628 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d4f5c98-5dd0-497e-be83-b8d669c734ad" containerName="registry-server" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481745 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d4f5c98-5dd0-497e-be83-b8d669c734ad" containerName="registry-server" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481756 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceb008dd-582e-42f0-b83a-d3523d677e61" containerName="registry-server" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481764 4898 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="199efc5b-5cf8-4d4b-85ad-44bdcd033bd6" containerName="oauth-openshift" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481775 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="41cb0379-2896-484c-83bc-5137fa621abe" containerName="pruner" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.481783 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b72963-e048-458b-8e4b-7642a1dfc096" containerName="registry-server" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.482136 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.484182 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.484427 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.484652 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.485232 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.485232 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.487010 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.487237 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.487298 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.487360 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.487571 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.487673 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.488159 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.494556 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.496798 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.506847 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6cb668d466-7t64n"] Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.508757 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.555272 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.555320 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.555341 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.555362 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.555382 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.555471 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.555535 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kplj2\" (UniqueName: \"kubernetes.io/projected/5531fe73-f1a4-4a40-9458-536d6a8e1865-kube-api-access-kplj2\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.555633 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5531fe73-f1a4-4a40-9458-536d6a8e1865-audit-policies\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.555762 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-user-template-login\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: 
\"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.555859 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.555919 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5531fe73-f1a4-4a40-9458-536d6a8e1865-audit-dir\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.555983 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.556074 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-session\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.556137 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-user-template-error\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.583881 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpfz8" event={"ID":"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed","Type":"ContainerStarted","Data":"77c45d087fb7d682f6fb466817956e8e9e5d7cca25915b49248a5c96b9de94a9"} Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.586367 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kl8sx" event={"ID":"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389","Type":"ContainerStarted","Data":"6045840bed859fbe133d9e92bd49915c2570c2780e21de82232e6f3376f5abdb"} Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.656660 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.656718 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kplj2\" (UniqueName: \"kubernetes.io/projected/5531fe73-f1a4-4a40-9458-536d6a8e1865-kube-api-access-kplj2\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.656747 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5531fe73-f1a4-4a40-9458-536d6a8e1865-audit-policies\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.656783 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-user-template-login\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.656818 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.656847 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5531fe73-f1a4-4a40-9458-536d6a8e1865-audit-dir\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.656871 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: 
\"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.656907 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-session\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.656936 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-user-template-error\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.656959 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.656983 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.657007 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.657034 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.657058 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.657602 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.658062 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.658188 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5531fe73-f1a4-4a40-9458-536d6a8e1865-audit-dir\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.661418 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.664477 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.664565 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.664524 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-session\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.664608 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.664797 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.664899 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-user-template-login\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.665601 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5531fe73-f1a4-4a40-9458-536d6a8e1865-audit-policies\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " 
pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.666138 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.667911 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5531fe73-f1a4-4a40-9458-536d6a8e1865-v4-0-config-user-template-error\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.682330 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kplj2\" (UniqueName: \"kubernetes.io/projected/5531fe73-f1a4-4a40-9458-536d6a8e1865-kube-api-access-kplj2\") pod \"oauth-openshift-6cb668d466-7t64n\" (UID: \"5531fe73-f1a4-4a40-9458-536d6a8e1865\") " pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:57 crc kubenswrapper[4898]: I1211 13:07:57.794520 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:58 crc kubenswrapper[4898]: I1211 13:07:58.018250 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6cb668d466-7t64n"] Dec 11 13:07:58 crc kubenswrapper[4898]: W1211 13:07:58.023488 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5531fe73_f1a4_4a40_9458_536d6a8e1865.slice/crio-d01d8849d889d4b7aef3d7f334f2779890664d6d30a63efa80905d2138f85645 WatchSource:0}: Error finding container d01d8849d889d4b7aef3d7f334f2779890664d6d30a63efa80905d2138f85645: Status 404 returned error can't find the container with id d01d8849d889d4b7aef3d7f334f2779890664d6d30a63efa80905d2138f85645 Dec 11 13:07:58 crc kubenswrapper[4898]: I1211 13:07:58.608865 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" event={"ID":"5531fe73-f1a4-4a40-9458-536d6a8e1865","Type":"ContainerStarted","Data":"6eef1143a0e4d15bf2b1f64d9bf45c94965dd150b3cccf64fdc8c84cb238a4ab"} Dec 11 13:07:58 crc kubenswrapper[4898]: I1211 13:07:58.609785 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" event={"ID":"5531fe73-f1a4-4a40-9458-536d6a8e1865","Type":"ContainerStarted","Data":"d01d8849d889d4b7aef3d7f334f2779890664d6d30a63efa80905d2138f85645"} Dec 11 13:07:58 crc kubenswrapper[4898]: I1211 13:07:58.628850 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kl8sx" podStartSLOduration=4.372821981 podStartE2EDuration="1m18.628830835s" podCreationTimestamp="2025-12-11 13:06:40 +0000 UTC" firstStartedPulling="2025-12-11 13:06:41.956700042 +0000 UTC m=+159.529026479" lastFinishedPulling="2025-12-11 13:07:56.212708886 +0000 UTC m=+233.785035333" observedRunningTime="2025-12-11 13:07:58.6235259 
+0000 UTC m=+236.195852327" watchObservedRunningTime="2025-12-11 13:07:58.628830835 +0000 UTC m=+236.201157282" Dec 11 13:07:58 crc kubenswrapper[4898]: I1211 13:07:58.648315 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" podStartSLOduration=28.648296666 podStartE2EDuration="28.648296666s" podCreationTimestamp="2025-12-11 13:07:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:07:58.64508596 +0000 UTC m=+236.217412397" watchObservedRunningTime="2025-12-11 13:07:58.648296666 +0000 UTC m=+236.220623103" Dec 11 13:07:58 crc kubenswrapper[4898]: I1211 13:07:58.672306 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mpfz8" podStartSLOduration=4.52191054 podStartE2EDuration="1m18.672290313s" podCreationTimestamp="2025-12-11 13:06:40 +0000 UTC" firstStartedPulling="2025-12-11 13:06:41.968978665 +0000 UTC m=+159.541305102" lastFinishedPulling="2025-12-11 13:07:56.119358438 +0000 UTC m=+233.691684875" observedRunningTime="2025-12-11 13:07:58.668224977 +0000 UTC m=+236.240551424" watchObservedRunningTime="2025-12-11 13:07:58.672290313 +0000 UTC m=+236.244616750" Dec 11 13:07:59 crc kubenswrapper[4898]: I1211 13:07:59.614187 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:07:59 crc kubenswrapper[4898]: I1211 13:07:59.618932 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" Dec 11 13:08:00 crc kubenswrapper[4898]: I1211 13:08:00.649484 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kl8sx" Dec 11 13:08:00 crc kubenswrapper[4898]: I1211 13:08:00.649567 4898 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kl8sx" Dec 11 13:08:00 crc kubenswrapper[4898]: I1211 13:08:00.720089 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kl8sx" Dec 11 13:08:01 crc kubenswrapper[4898]: I1211 13:08:01.076660 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mpfz8" Dec 11 13:08:01 crc kubenswrapper[4898]: I1211 13:08:01.077051 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mpfz8" Dec 11 13:08:01 crc kubenswrapper[4898]: I1211 13:08:01.115818 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mpfz8" Dec 11 13:08:01 crc kubenswrapper[4898]: I1211 13:08:01.677704 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mpfz8" Dec 11 13:08:02 crc kubenswrapper[4898]: I1211 13:08:02.402736 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mpfz8"] Dec 11 13:08:03 crc kubenswrapper[4898]: I1211 13:08:03.638234 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mpfz8" podUID="8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" containerName="registry-server" containerID="cri-o://77c45d087fb7d682f6fb466817956e8e9e5d7cca25915b49248a5c96b9de94a9" gracePeriod=2 Dec 11 13:08:03 crc kubenswrapper[4898]: I1211 13:08:03.984749 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mpfz8" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.058924 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6txzk\" (UniqueName: \"kubernetes.io/projected/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-kube-api-access-6txzk\") pod \"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed\" (UID: \"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed\") " Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.058995 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-utilities\") pod \"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed\" (UID: \"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed\") " Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.059033 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-catalog-content\") pod \"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed\" (UID: \"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed\") " Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.059999 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-utilities" (OuterVolumeSpecName: "utilities") pod "8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" (UID: "8c5302aa-5a81-4257-93b8-aefd5e5cc2ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.064723 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-kube-api-access-6txzk" (OuterVolumeSpecName: "kube-api-access-6txzk") pod "8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" (UID: "8c5302aa-5a81-4257-93b8-aefd5e5cc2ed"). InnerVolumeSpecName "kube-api-access-6txzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.085198 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" (UID: "8c5302aa-5a81-4257-93b8-aefd5e5cc2ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.160249 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6txzk\" (UniqueName: \"kubernetes.io/projected/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-kube-api-access-6txzk\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.160312 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.160327 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.646822 4898 generic.go:334] "Generic (PLEG): container finished" podID="8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" containerID="77c45d087fb7d682f6fb466817956e8e9e5d7cca25915b49248a5c96b9de94a9" exitCode=0 Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.646865 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpfz8" event={"ID":"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed","Type":"ContainerDied","Data":"77c45d087fb7d682f6fb466817956e8e9e5d7cca25915b49248a5c96b9de94a9"} Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.646908 4898 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-mpfz8" event={"ID":"8c5302aa-5a81-4257-93b8-aefd5e5cc2ed","Type":"ContainerDied","Data":"bfba02839018ee6d150c0f949a62799d9db907ae03acb515312999598c77667f"} Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.646923 4898 scope.go:117] "RemoveContainer" containerID="77c45d087fb7d682f6fb466817956e8e9e5d7cca25915b49248a5c96b9de94a9" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.647045 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mpfz8" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.670473 4898 scope.go:117] "RemoveContainer" containerID="cd3f845b260d077344ddcda1347bf579abd72a822ddb9d37e5986d4b0f914ef8" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.687956 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mpfz8"] Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.688419 4898 scope.go:117] "RemoveContainer" containerID="a801c1fa5517ca32c45554b7817c497a8cd2e8b2260c9ae471748178efdc61d6" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.695063 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mpfz8"] Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.706130 4898 scope.go:117] "RemoveContainer" containerID="77c45d087fb7d682f6fb466817956e8e9e5d7cca25915b49248a5c96b9de94a9" Dec 11 13:08:04 crc kubenswrapper[4898]: E1211 13:08:04.706772 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77c45d087fb7d682f6fb466817956e8e9e5d7cca25915b49248a5c96b9de94a9\": container with ID starting with 77c45d087fb7d682f6fb466817956e8e9e5d7cca25915b49248a5c96b9de94a9 not found: ID does not exist" containerID="77c45d087fb7d682f6fb466817956e8e9e5d7cca25915b49248a5c96b9de94a9" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.706797 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77c45d087fb7d682f6fb466817956e8e9e5d7cca25915b49248a5c96b9de94a9"} err="failed to get container status \"77c45d087fb7d682f6fb466817956e8e9e5d7cca25915b49248a5c96b9de94a9\": rpc error: code = NotFound desc = could not find container \"77c45d087fb7d682f6fb466817956e8e9e5d7cca25915b49248a5c96b9de94a9\": container with ID starting with 77c45d087fb7d682f6fb466817956e8e9e5d7cca25915b49248a5c96b9de94a9 not found: ID does not exist" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.706818 4898 scope.go:117] "RemoveContainer" containerID="cd3f845b260d077344ddcda1347bf579abd72a822ddb9d37e5986d4b0f914ef8" Dec 11 13:08:04 crc kubenswrapper[4898]: E1211 13:08:04.707249 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd3f845b260d077344ddcda1347bf579abd72a822ddb9d37e5986d4b0f914ef8\": container with ID starting with cd3f845b260d077344ddcda1347bf579abd72a822ddb9d37e5986d4b0f914ef8 not found: ID does not exist" containerID="cd3f845b260d077344ddcda1347bf579abd72a822ddb9d37e5986d4b0f914ef8" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.707356 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd3f845b260d077344ddcda1347bf579abd72a822ddb9d37e5986d4b0f914ef8"} err="failed to get container status \"cd3f845b260d077344ddcda1347bf579abd72a822ddb9d37e5986d4b0f914ef8\": rpc error: code = NotFound desc = could not find container \"cd3f845b260d077344ddcda1347bf579abd72a822ddb9d37e5986d4b0f914ef8\": container with ID starting with cd3f845b260d077344ddcda1347bf579abd72a822ddb9d37e5986d4b0f914ef8 not found: ID does not exist" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.707469 4898 scope.go:117] "RemoveContainer" containerID="a801c1fa5517ca32c45554b7817c497a8cd2e8b2260c9ae471748178efdc61d6" Dec 11 13:08:04 crc kubenswrapper[4898]: E1211 
13:08:04.708368 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a801c1fa5517ca32c45554b7817c497a8cd2e8b2260c9ae471748178efdc61d6\": container with ID starting with a801c1fa5517ca32c45554b7817c497a8cd2e8b2260c9ae471748178efdc61d6 not found: ID does not exist" containerID="a801c1fa5517ca32c45554b7817c497a8cd2e8b2260c9ae471748178efdc61d6" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.708430 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a801c1fa5517ca32c45554b7817c497a8cd2e8b2260c9ae471748178efdc61d6"} err="failed to get container status \"a801c1fa5517ca32c45554b7817c497a8cd2e8b2260c9ae471748178efdc61d6\": rpc error: code = NotFound desc = could not find container \"a801c1fa5517ca32c45554b7817c497a8cd2e8b2260c9ae471748178efdc61d6\": container with ID starting with a801c1fa5517ca32c45554b7817c497a8cd2e8b2260c9ae471748178efdc61d6 not found: ID does not exist" Dec 11 13:08:04 crc kubenswrapper[4898]: I1211 13:08:04.780903 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" path="/var/lib/kubelet/pods/8c5302aa-5a81-4257-93b8-aefd5e5cc2ed/volumes" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.087565 4898 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.088158 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5" gracePeriod=15 Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.088220 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0" gracePeriod=15 Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.088295 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59" gracePeriod=15 Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.088283 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5" gracePeriod=15 Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.088255 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021" gracePeriod=15 Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.090564 4898 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 11 13:08:06 crc kubenswrapper[4898]: E1211 13:08:06.090892 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" containerName="extract-content" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.090911 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" containerName="extract-content" Dec 11 13:08:06 crc kubenswrapper[4898]: E1211 13:08:06.090925 4898 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.090961 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 11 13:08:06 crc kubenswrapper[4898]: E1211 13:08:06.090971 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" containerName="registry-server" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.090978 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" containerName="registry-server" Dec 11 13:08:06 crc kubenswrapper[4898]: E1211 13:08:06.090986 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.090992 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 11 13:08:06 crc kubenswrapper[4898]: E1211 13:08:06.091001 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.091006 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 11 13:08:06 crc kubenswrapper[4898]: E1211 13:08:06.091037 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" containerName="extract-utilities" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.091046 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" containerName="extract-utilities" Dec 11 13:08:06 crc kubenswrapper[4898]: 
E1211 13:08:06.091057 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.091062 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 11 13:08:06 crc kubenswrapper[4898]: E1211 13:08:06.091069 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.091075 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 11 13:08:06 crc kubenswrapper[4898]: E1211 13:08:06.091082 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.091088 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 11 13:08:06 crc kubenswrapper[4898]: E1211 13:08:06.091118 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.091127 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.091286 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.091299 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5302aa-5a81-4257-93b8-aefd5e5cc2ed" containerName="registry-server" Dec 
11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.091308 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.091324 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.091332 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.091340 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.091763 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.095910 4898 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.097011 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.101989 4898 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.183788 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.183849 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.183875 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.183913 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.183937 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.183967 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.183992 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.184012 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.284765 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.284854 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.284889 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.284888 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.284909 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.284950 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc 
kubenswrapper[4898]: I1211 13:08:06.284994 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.285004 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.285033 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.285036 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.285059 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.285092 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.285102 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.285114 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.285139 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.285168 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.661402 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="d8b8fd27-089f-46e4-985c-09e1feb795aa" containerID="d26b466e445b0f327eb76fb60561b9d0c644f7dea23e43dcfe13bd8e132730f2" exitCode=0 Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.661541 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d8b8fd27-089f-46e4-985c-09e1feb795aa","Type":"ContainerDied","Data":"d26b466e445b0f327eb76fb60561b9d0c644f7dea23e43dcfe13bd8e132730f2"} Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.662675 4898 status_manager.go:851] "Failed to get status for pod" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.665678 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.667269 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.668121 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0" exitCode=0 Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.668166 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021" exitCode=0 Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.668188 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5" exitCode=0 Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.668206 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59" exitCode=2 Dec 11 13:08:06 crc kubenswrapper[4898]: I1211 13:08:06.668267 4898 scope.go:117] "RemoveContainer" containerID="8857b4ade110b31598d9080f45bc15f5702b2c278fe7909e4d172c9877e61534" Dec 11 13:08:07 crc kubenswrapper[4898]: I1211 13:08:07.681499 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 11 13:08:07 crc kubenswrapper[4898]: I1211 13:08:07.927718 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 11 13:08:07 crc kubenswrapper[4898]: I1211 13:08:07.928292 4898 status_manager.go:851] "Failed to get status for pod" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.010936 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d8b8fd27-089f-46e4-985c-09e1feb795aa-var-lock\") pod \"d8b8fd27-089f-46e4-985c-09e1feb795aa\" (UID: \"d8b8fd27-089f-46e4-985c-09e1feb795aa\") " Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.011317 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8b8fd27-089f-46e4-985c-09e1feb795aa-kube-api-access\") pod \"d8b8fd27-089f-46e4-985c-09e1feb795aa\" (UID: 
\"d8b8fd27-089f-46e4-985c-09e1feb795aa\") " Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.011340 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8b8fd27-089f-46e4-985c-09e1feb795aa-kubelet-dir\") pod \"d8b8fd27-089f-46e4-985c-09e1feb795aa\" (UID: \"d8b8fd27-089f-46e4-985c-09e1feb795aa\") " Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.011025 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8b8fd27-089f-46e4-985c-09e1feb795aa-var-lock" (OuterVolumeSpecName: "var-lock") pod "d8b8fd27-089f-46e4-985c-09e1feb795aa" (UID: "d8b8fd27-089f-46e4-985c-09e1feb795aa"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.011478 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8b8fd27-089f-46e4-985c-09e1feb795aa-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d8b8fd27-089f-46e4-985c-09e1feb795aa" (UID: "d8b8fd27-089f-46e4-985c-09e1feb795aa"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.011613 4898 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d8b8fd27-089f-46e4-985c-09e1feb795aa-var-lock\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.011630 4898 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8b8fd27-089f-46e4-985c-09e1feb795aa-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.017052 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b8fd27-089f-46e4-985c-09e1feb795aa-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d8b8fd27-089f-46e4-985c-09e1feb795aa" (UID: "d8b8fd27-089f-46e4-985c-09e1feb795aa"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.112535 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8b8fd27-089f-46e4-985c-09e1feb795aa-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.465071 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.466077 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.467050 4898 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.467338 4898 status_manager.go:851] "Failed to get status for pod" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.616016 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.616067 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.616092 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.616288 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.616332 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.616336 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.688871 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.689575 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d8b8fd27-089f-46e4-985c-09e1feb795aa","Type":"ContainerDied","Data":"91cac5de6b3d76116700b65b6056c167283246b9de4aadf2a9714ef6b5ba0954"} Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.689604 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91cac5de6b3d76116700b65b6056c167283246b9de4aadf2a9714ef6b5ba0954" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.691875 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.692500 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5" exitCode=0 Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.692534 4898 scope.go:117] "RemoveContainer" containerID="7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.692586 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.703449 4898 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.703648 4898 status_manager.go:851] "Failed to get status for pod" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.706626 4898 scope.go:117] "RemoveContainer" containerID="5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.708390 4898 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.708871 4898 status_manager.go:851] "Failed to get status for pod" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.717954 4898 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.717974 4898 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.717982 4898 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.721653 4898 scope.go:117] "RemoveContainer" containerID="765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.735063 4898 scope.go:117] "RemoveContainer" containerID="41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.746446 4898 scope.go:117] "RemoveContainer" containerID="df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.761718 4898 scope.go:117] "RemoveContainer" containerID="ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.778435 4898 scope.go:117] "RemoveContainer" containerID="7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0" Dec 11 13:08:08 crc kubenswrapper[4898]: E1211 13:08:08.779558 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\": container with ID starting with 7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0 not found: ID does not exist" containerID="7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0" Dec 11 13:08:08 crc 
kubenswrapper[4898]: I1211 13:08:08.779591 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0"} err="failed to get container status \"7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\": rpc error: code = NotFound desc = could not find container \"7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0\": container with ID starting with 7ccc1530df8a90b252f6008c3a1734fd489bcb57944c2cc46bdf3c7554d837c0 not found: ID does not exist" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.779684 4898 scope.go:117] "RemoveContainer" containerID="5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021" Dec 11 13:08:08 crc kubenswrapper[4898]: E1211 13:08:08.780535 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\": container with ID starting with 5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021 not found: ID does not exist" containerID="5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.780579 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021"} err="failed to get container status \"5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\": rpc error: code = NotFound desc = could not find container \"5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021\": container with ID starting with 5a6e962a8c37823fc7d0047f9c1cae14311bfa1f36236addff039bba3369a021 not found: ID does not exist" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.780611 4898 scope.go:117] "RemoveContainer" containerID="765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5" Dec 11 
13:08:08 crc kubenswrapper[4898]: E1211 13:08:08.780906 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\": container with ID starting with 765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5 not found: ID does not exist" containerID="765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.780931 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5"} err="failed to get container status \"765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\": rpc error: code = NotFound desc = could not find container \"765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5\": container with ID starting with 765cb993103d23770e20a725ce1f28ca635d13d2ecd05676039a833274ecf3e5 not found: ID does not exist" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.780948 4898 scope.go:117] "RemoveContainer" containerID="41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59" Dec 11 13:08:08 crc kubenswrapper[4898]: E1211 13:08:08.781146 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\": container with ID starting with 41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59 not found: ID does not exist" containerID="41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.781171 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59"} err="failed to get container status 
\"41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\": rpc error: code = NotFound desc = could not find container \"41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59\": container with ID starting with 41568c567a418a6ebd534c09d02c4331dfa2cd00580441bfdd83a15a12153a59 not found: ID does not exist" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.781188 4898 scope.go:117] "RemoveContainer" containerID="df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.781302 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 11 13:08:08 crc kubenswrapper[4898]: E1211 13:08:08.781392 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\": container with ID starting with df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5 not found: ID does not exist" containerID="df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.781412 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5"} err="failed to get container status \"df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\": rpc error: code = NotFound desc = could not find container \"df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5\": container with ID starting with df081be8bffcbf5068ade5f09ad81f2ea332d110fb0dffc1ab215f4a447e75d5 not found: ID does not exist" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.781427 4898 scope.go:117] "RemoveContainer" containerID="ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642" Dec 11 13:08:08 
crc kubenswrapper[4898]: E1211 13:08:08.781641 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\": container with ID starting with ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642 not found: ID does not exist" containerID="ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642" Dec 11 13:08:08 crc kubenswrapper[4898]: I1211 13:08:08.781669 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642"} err="failed to get container status \"ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\": rpc error: code = NotFound desc = could not find container \"ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642\": container with ID starting with ca4b5408884ed07718750ad82f77961d242327ca11991fd4dbbd1b5b53c5b642 not found: ID does not exist" Dec 11 13:08:10 crc kubenswrapper[4898]: I1211 13:08:10.680924 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kl8sx" Dec 11 13:08:10 crc kubenswrapper[4898]: I1211 13:08:10.681604 4898 status_manager.go:851] "Failed to get status for pod" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" pod="openshift-marketplace/redhat-marketplace-kl8sx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kl8sx\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:10 crc kubenswrapper[4898]: I1211 13:08:10.682302 4898 status_manager.go:851] "Failed to get status for pod" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: 
connection refused" Dec 11 13:08:11 crc kubenswrapper[4898]: E1211 13:08:11.102601 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:08:11Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:08:11Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:08:11Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:08:11Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:11 crc kubenswrapper[4898]: E1211 13:08:11.102865 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:11 crc kubenswrapper[4898]: E1211 13:08:11.103089 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:11 crc kubenswrapper[4898]: E1211 13:08:11.103347 4898 kubelet_node_status.go:585] "Error updating node status, 
will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:11 crc kubenswrapper[4898]: E1211 13:08:11.103575 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:11 crc kubenswrapper[4898]: E1211 13:08:11.103598 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 11 13:08:11 crc kubenswrapper[4898]: E1211 13:08:11.126221 4898 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.18:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:11 crc kubenswrapper[4898]: I1211 13:08:11.126780 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:11 crc kubenswrapper[4898]: W1211 13:08:11.143109 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-8a9396e6c8ffddb7f540962543a1521ef98a9e3a464ffb04753488be3286a02b WatchSource:0}: Error finding container 8a9396e6c8ffddb7f540962543a1521ef98a9e3a464ffb04753488be3286a02b: Status 404 returned error can't find the container with id 8a9396e6c8ffddb7f540962543a1521ef98a9e3a464ffb04753488be3286a02b Dec 11 13:08:11 crc kubenswrapper[4898]: E1211 13:08:11.147416 4898 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.18:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18802b24d6829240 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-11 13:08:11.1469328 +0000 UTC m=+248.719259237,LastTimestamp:2025-12-11 13:08:11.1469328 +0000 UTC m=+248.719259237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 11 13:08:11 crc kubenswrapper[4898]: I1211 13:08:11.714789 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"09f45a93d3650f00907d6932465b34c3688f3d834a8480b3e40015627fbd9008"} Dec 11 13:08:11 crc kubenswrapper[4898]: I1211 13:08:11.715305 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"8a9396e6c8ffddb7f540962543a1521ef98a9e3a464ffb04753488be3286a02b"} Dec 11 13:08:11 crc kubenswrapper[4898]: I1211 13:08:11.716497 4898 status_manager.go:851] "Failed to get status for pod" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:11 crc kubenswrapper[4898]: E1211 13:08:11.716604 4898 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.18:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:11 crc kubenswrapper[4898]: I1211 13:08:11.716692 4898 status_manager.go:851] "Failed to get status for pod" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" pod="openshift-marketplace/redhat-marketplace-kl8sx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kl8sx\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:12 crc kubenswrapper[4898]: I1211 13:08:12.777214 4898 status_manager.go:851] "Failed to get status for pod" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" pod="openshift-marketplace/redhat-marketplace-kl8sx" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kl8sx\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:12 crc kubenswrapper[4898]: I1211 13:08:12.777836 4898 status_manager.go:851] "Failed to get status for pod" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:12 crc kubenswrapper[4898]: E1211 13:08:12.872447 4898 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.18:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" volumeName="registry-storage" Dec 11 13:08:14 crc kubenswrapper[4898]: E1211 13:08:14.979653 4898 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.18:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18802b24d6829240 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-11 13:08:11.1469328 +0000 UTC m=+248.719259237,LastTimestamp:2025-12-11 13:08:11.1469328 +0000 UTC m=+248.719259237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 11 13:08:16 crc kubenswrapper[4898]: E1211 13:08:16.293435 4898 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:16 crc kubenswrapper[4898]: E1211 13:08:16.294557 4898 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:16 crc kubenswrapper[4898]: E1211 13:08:16.295108 4898 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:16 crc kubenswrapper[4898]: E1211 13:08:16.295492 4898 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:16 crc kubenswrapper[4898]: E1211 13:08:16.295819 4898 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:16 crc kubenswrapper[4898]: I1211 13:08:16.295990 4898 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update 
lease" Dec 11 13:08:16 crc kubenswrapper[4898]: E1211 13:08:16.296533 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="200ms" Dec 11 13:08:16 crc kubenswrapper[4898]: E1211 13:08:16.497594 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="400ms" Dec 11 13:08:16 crc kubenswrapper[4898]: E1211 13:08:16.898527 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="800ms" Dec 11 13:08:17 crc kubenswrapper[4898]: E1211 13:08:17.699411 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="1.6s" Dec 11 13:08:19 crc kubenswrapper[4898]: E1211 13:08:19.301076 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="3.2s" Dec 11 13:08:20 crc kubenswrapper[4898]: I1211 13:08:20.767266 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 11 13:08:20 crc kubenswrapper[4898]: I1211 13:08:20.767568 4898 generic.go:334] 
"Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3" exitCode=1 Dec 11 13:08:20 crc kubenswrapper[4898]: I1211 13:08:20.767601 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3"} Dec 11 13:08:20 crc kubenswrapper[4898]: I1211 13:08:20.768075 4898 scope.go:117] "RemoveContainer" containerID="a52b842d9e7001cf1e9c3284c56ac9a76f642b66457d6717c6367e00ef89fdf3" Dec 11 13:08:20 crc kubenswrapper[4898]: I1211 13:08:20.768414 4898 status_manager.go:851] "Failed to get status for pod" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" pod="openshift-marketplace/redhat-marketplace-kl8sx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kl8sx\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:20 crc kubenswrapper[4898]: I1211 13:08:20.768888 4898 status_manager.go:851] "Failed to get status for pod" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:20 crc kubenswrapper[4898]: I1211 13:08:20.769134 4898 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:20 crc kubenswrapper[4898]: I1211 13:08:20.788442 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 13:08:21 crc kubenswrapper[4898]: E1211 13:08:21.182439 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:08:21Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:08:21Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:08:21Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-11T13:08:21Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:21 crc kubenswrapper[4898]: E1211 13:08:21.182879 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:21 crc kubenswrapper[4898]: E1211 13:08:21.183160 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:21 crc kubenswrapper[4898]: E1211 
13:08:21.183479 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:21 crc kubenswrapper[4898]: E1211 13:08:21.183733 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:21 crc kubenswrapper[4898]: E1211 13:08:21.183751 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 11 13:08:21 crc kubenswrapper[4898]: I1211 13:08:21.773869 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:21 crc kubenswrapper[4898]: I1211 13:08:21.774705 4898 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:21 crc kubenswrapper[4898]: I1211 13:08:21.775122 4898 status_manager.go:851] "Failed to get status for pod" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" pod="openshift-marketplace/redhat-marketplace-kl8sx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kl8sx\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:21 crc kubenswrapper[4898]: I1211 13:08:21.775408 4898 status_manager.go:851] "Failed to get status for pod" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:21 crc kubenswrapper[4898]: I1211 13:08:21.777268 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 11 13:08:21 crc kubenswrapper[4898]: I1211 13:08:21.777324 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6252c045d45cde98c2876fb59e69328d8941f0d752605beba07248178873f791"} Dec 11 13:08:21 crc kubenswrapper[4898]: I1211 13:08:21.778209 4898 status_manager.go:851] "Failed to get status for pod" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" pod="openshift-marketplace/redhat-marketplace-kl8sx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kl8sx\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:21 crc kubenswrapper[4898]: I1211 13:08:21.778700 4898 status_manager.go:851] "Failed to get status for pod" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:21 crc kubenswrapper[4898]: I1211 13:08:21.779021 4898 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:21 crc kubenswrapper[4898]: I1211 
13:08:21.795722 4898 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fc485a2-a69e-4453-8c3d-4b9698caa632" Dec 11 13:08:21 crc kubenswrapper[4898]: I1211 13:08:21.795782 4898 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fc485a2-a69e-4453-8c3d-4b9698caa632" Dec 11 13:08:21 crc kubenswrapper[4898]: E1211 13:08:21.796359 4898 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:21 crc kubenswrapper[4898]: I1211 13:08:21.796960 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:21 crc kubenswrapper[4898]: W1211 13:08:21.815726 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-187156aaa40681813c16d9cea310eb39d7bbfaeab1a398d0ec61bb9af8216d73 WatchSource:0}: Error finding container 187156aaa40681813c16d9cea310eb39d7bbfaeab1a398d0ec61bb9af8216d73: Status 404 returned error can't find the container with id 187156aaa40681813c16d9cea310eb39d7bbfaeab1a398d0ec61bb9af8216d73 Dec 11 13:08:22 crc kubenswrapper[4898]: I1211 13:08:22.121236 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 13:08:22 crc kubenswrapper[4898]: E1211 13:08:22.502137 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="6.4s" Dec 11 13:08:22 crc 
kubenswrapper[4898]: I1211 13:08:22.786514 4898 status_manager.go:851] "Failed to get status for pod" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" pod="openshift-marketplace/redhat-marketplace-kl8sx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kl8sx\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:22 crc kubenswrapper[4898]: I1211 13:08:22.786893 4898 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:22 crc kubenswrapper[4898]: I1211 13:08:22.787379 4898 status_manager.go:851] "Failed to get status for pod" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:22 crc kubenswrapper[4898]: I1211 13:08:22.787883 4898 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:22 crc kubenswrapper[4898]: I1211 13:08:22.788389 4898 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3134222a11966763a277018102f0c6ff6378b07f7caa11e9667147980b2fbf56" exitCode=0 Dec 11 13:08:22 crc kubenswrapper[4898]: I1211 13:08:22.788525 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3134222a11966763a277018102f0c6ff6378b07f7caa11e9667147980b2fbf56"} Dec 11 13:08:22 crc kubenswrapper[4898]: I1211 13:08:22.788597 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"187156aaa40681813c16d9cea310eb39d7bbfaeab1a398d0ec61bb9af8216d73"} Dec 11 13:08:22 crc kubenswrapper[4898]: I1211 13:08:22.789150 4898 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fc485a2-a69e-4453-8c3d-4b9698caa632" Dec 11 13:08:22 crc kubenswrapper[4898]: I1211 13:08:22.789198 4898 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fc485a2-a69e-4453-8c3d-4b9698caa632" Dec 11 13:08:22 crc kubenswrapper[4898]: I1211 13:08:22.789511 4898 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:22 crc kubenswrapper[4898]: E1211 13:08:22.789722 4898 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:22 crc kubenswrapper[4898]: I1211 13:08:22.789981 4898 status_manager.go:851] "Failed to get status for pod" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" pod="openshift-marketplace/redhat-marketplace-kl8sx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kl8sx\": dial tcp 
38.102.83.18:6443: connect: connection refused" Dec 11 13:08:22 crc kubenswrapper[4898]: I1211 13:08:22.790447 4898 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:22 crc kubenswrapper[4898]: I1211 13:08:22.791578 4898 status_manager.go:851] "Failed to get status for pod" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 13:08:23 crc kubenswrapper[4898]: I1211 13:08:23.795112 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"80fa49d89ab6d094e70f60b84c4cfb575d6cd8da8fbabb0067657d2d8d014306"} Dec 11 13:08:23 crc kubenswrapper[4898]: I1211 13:08:23.795412 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e6f3d6e058fa353c2826acc923dd44daf06359c802d91bc4efcbd6f051d12d9f"} Dec 11 13:08:23 crc kubenswrapper[4898]: I1211 13:08:23.795422 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5c407021afc1ba43df2d704e3faf16c804cde4de699b57de5a69922a008c5fe8"} Dec 11 13:08:23 crc kubenswrapper[4898]: I1211 13:08:23.795430 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"feeeea8abe1463533debf815b42a0a65f79c397fe83c7e751a78196a10587144"} Dec 11 13:08:24 crc kubenswrapper[4898]: I1211 13:08:24.802240 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4017d596d2e340ee67b3e802d79d650e4d0e3e45f78b60b4349d64b0d8d10ab9"} Dec 11 13:08:24 crc kubenswrapper[4898]: I1211 13:08:24.802568 4898 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fc485a2-a69e-4453-8c3d-4b9698caa632" Dec 11 13:08:24 crc kubenswrapper[4898]: I1211 13:08:24.802587 4898 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fc485a2-a69e-4453-8c3d-4b9698caa632" Dec 11 13:08:24 crc kubenswrapper[4898]: I1211 13:08:24.802624 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:26 crc kubenswrapper[4898]: I1211 13:08:26.797683 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:26 crc kubenswrapper[4898]: I1211 13:08:26.797766 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:26 crc kubenswrapper[4898]: I1211 13:08:26.804673 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:27 crc kubenswrapper[4898]: I1211 13:08:27.798235 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 13:08:27 crc kubenswrapper[4898]: I1211 13:08:27.804860 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 13:08:29 crc kubenswrapper[4898]: I1211 13:08:29.811917 4898 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:29 crc kubenswrapper[4898]: I1211 13:08:29.845378 4898 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b31af037-37dc-4116-99e7-33c3416b548e" Dec 11 13:08:30 crc kubenswrapper[4898]: I1211 13:08:30.832195 4898 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fc485a2-a69e-4453-8c3d-4b9698caa632" Dec 11 13:08:30 crc kubenswrapper[4898]: I1211 13:08:30.832585 4898 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fc485a2-a69e-4453-8c3d-4b9698caa632" Dec 11 13:08:30 crc kubenswrapper[4898]: I1211 13:08:30.837211 4898 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b31af037-37dc-4116-99e7-33c3416b548e" Dec 11 13:08:30 crc kubenswrapper[4898]: I1211 13:08:30.840067 4898 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://feeeea8abe1463533debf815b42a0a65f79c397fe83c7e751a78196a10587144" Dec 11 13:08:30 crc kubenswrapper[4898]: I1211 13:08:30.840325 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:31 crc kubenswrapper[4898]: I1211 13:08:31.836702 4898 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fc485a2-a69e-4453-8c3d-4b9698caa632" Dec 11 13:08:31 crc kubenswrapper[4898]: I1211 13:08:31.836730 4898 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7fc485a2-a69e-4453-8c3d-4b9698caa632" Dec 11 13:08:31 crc kubenswrapper[4898]: I1211 13:08:31.840198 4898 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b31af037-37dc-4116-99e7-33c3416b548e" Dec 11 13:08:32 crc kubenswrapper[4898]: I1211 13:08:32.128947 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 13:08:39 crc kubenswrapper[4898]: I1211 13:08:39.325545 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 11 13:08:39 crc kubenswrapper[4898]: I1211 13:08:39.399581 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 11 13:08:39 crc kubenswrapper[4898]: I1211 13:08:39.954755 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 11 13:08:40 crc kubenswrapper[4898]: I1211 13:08:40.273162 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 11 13:08:40 crc kubenswrapper[4898]: I1211 13:08:40.339484 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 11 13:08:40 crc kubenswrapper[4898]: I1211 13:08:40.458134 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 11 13:08:40 crc kubenswrapper[4898]: I1211 13:08:40.639519 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 11 13:08:40 crc 
kubenswrapper[4898]: I1211 13:08:40.676251 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 11 13:08:40 crc kubenswrapper[4898]: I1211 13:08:40.701705 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 11 13:08:40 crc kubenswrapper[4898]: I1211 13:08:40.934675 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 11 13:08:41 crc kubenswrapper[4898]: I1211 13:08:41.204492 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 11 13:08:41 crc kubenswrapper[4898]: I1211 13:08:41.247288 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 11 13:08:41 crc kubenswrapper[4898]: I1211 13:08:41.318420 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 11 13:08:41 crc kubenswrapper[4898]: I1211 13:08:41.421294 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 11 13:08:41 crc kubenswrapper[4898]: I1211 13:08:41.425936 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 11 13:08:41 crc kubenswrapper[4898]: I1211 13:08:41.514866 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 11 13:08:41 crc kubenswrapper[4898]: I1211 13:08:41.779605 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 11 13:08:41 crc kubenswrapper[4898]: I1211 13:08:41.839256 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 11 13:08:41 crc kubenswrapper[4898]: I1211 13:08:41.849055 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 11 13:08:41 crc kubenswrapper[4898]: I1211 13:08:41.871248 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 11 13:08:41 crc kubenswrapper[4898]: I1211 13:08:41.919306 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 11 13:08:42 crc kubenswrapper[4898]: I1211 13:08:42.169439 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 11 13:08:42 crc kubenswrapper[4898]: I1211 13:08:42.488035 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 11 13:08:42 crc kubenswrapper[4898]: I1211 13:08:42.511752 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 11 13:08:42 crc kubenswrapper[4898]: I1211 13:08:42.522263 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 11 13:08:42 crc kubenswrapper[4898]: I1211 13:08:42.606313 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 11 13:08:42 crc kubenswrapper[4898]: I1211 13:08:42.692535 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 11 13:08:42 crc kubenswrapper[4898]: I1211 13:08:42.710474 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 11 13:08:42 crc kubenswrapper[4898]: I1211 13:08:42.722847 
4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 11 13:08:42 crc kubenswrapper[4898]: I1211 13:08:42.792877 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 11 13:08:42 crc kubenswrapper[4898]: I1211 13:08:42.856723 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 11 13:08:42 crc kubenswrapper[4898]: I1211 13:08:42.897094 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 11 13:08:43 crc kubenswrapper[4898]: I1211 13:08:43.117755 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 11 13:08:43 crc kubenswrapper[4898]: I1211 13:08:43.123799 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 11 13:08:43 crc kubenswrapper[4898]: I1211 13:08:43.299251 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 11 13:08:43 crc kubenswrapper[4898]: I1211 13:08:43.385364 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 11 13:08:43 crc kubenswrapper[4898]: I1211 13:08:43.443169 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 11 13:08:43 crc kubenswrapper[4898]: I1211 13:08:43.589766 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 11 13:08:43 crc 
kubenswrapper[4898]: I1211 13:08:43.616078 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 11 13:08:43 crc kubenswrapper[4898]: I1211 13:08:43.661048 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 11 13:08:43 crc kubenswrapper[4898]: I1211 13:08:43.739075 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 11 13:08:43 crc kubenswrapper[4898]: I1211 13:08:43.786968 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 11 13:08:43 crc kubenswrapper[4898]: I1211 13:08:43.803116 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 11 13:08:43 crc kubenswrapper[4898]: I1211 13:08:43.910512 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 11 13:08:43 crc kubenswrapper[4898]: I1211 13:08:43.918469 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 11 13:08:44 crc kubenswrapper[4898]: I1211 13:08:44.065421 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 11 13:08:44 crc kubenswrapper[4898]: I1211 13:08:44.312097 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 11 13:08:44 crc kubenswrapper[4898]: I1211 13:08:44.320574 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 11 13:08:44 crc kubenswrapper[4898]: I1211 13:08:44.373982 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 11 13:08:44 crc kubenswrapper[4898]: I1211 13:08:44.402408 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 11 13:08:44 crc kubenswrapper[4898]: I1211 13:08:44.492490 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 11 13:08:44 crc kubenswrapper[4898]: I1211 13:08:44.507865 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 11 13:08:44 crc kubenswrapper[4898]: I1211 13:08:44.604896 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 11 13:08:44 crc kubenswrapper[4898]: I1211 13:08:44.771806 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 11 13:08:44 crc kubenswrapper[4898]: I1211 13:08:44.820574 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 11 13:08:44 crc kubenswrapper[4898]: I1211 13:08:44.833389 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 11 13:08:44 crc kubenswrapper[4898]: I1211 13:08:44.922193 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 11 13:08:44 crc kubenswrapper[4898]: I1211 13:08:44.989619 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 11 13:08:45 crc kubenswrapper[4898]: I1211 13:08:45.002509 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 11 13:08:45 crc kubenswrapper[4898]: I1211 13:08:45.078020 4898 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 11 13:08:45 crc kubenswrapper[4898]: I1211 13:08:45.083174 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 11 13:08:45 crc kubenswrapper[4898]: I1211 13:08:45.130284 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 11 13:08:45 crc kubenswrapper[4898]: I1211 13:08:45.215574 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 11 13:08:45 crc kubenswrapper[4898]: I1211 13:08:45.244606 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 11 13:08:45 crc kubenswrapper[4898]: I1211 13:08:45.327252 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 11 13:08:45 crc kubenswrapper[4898]: I1211 13:08:45.482500 4898 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 11 13:08:45 crc kubenswrapper[4898]: I1211 13:08:45.485159 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 11 13:08:45 crc kubenswrapper[4898]: I1211 13:08:45.548859 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 11 13:08:45 crc kubenswrapper[4898]: I1211 13:08:45.605001 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 11 13:08:45 crc kubenswrapper[4898]: I1211 13:08:45.840985 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 11 13:08:45 crc kubenswrapper[4898]: I1211 
13:08:45.883426 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 11 13:08:45 crc kubenswrapper[4898]: I1211 13:08:45.928229 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 11 13:08:45 crc kubenswrapper[4898]: I1211 13:08:45.962416 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.028558 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.119263 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.170558 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.254253 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.288249 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.379814 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.476321 4898 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.506708 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.547811 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.715352 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.760011 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.763687 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.839630 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.886841 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.937424 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 11 13:08:46 crc kubenswrapper[4898]: I1211 13:08:46.999568 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 11 13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.019649 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 11 13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.060223 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 11 13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.115732 4898 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 11 13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.311966 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 11 13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.336969 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 11 13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.535402 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 11 13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.620666 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 11 13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.659718 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 11 13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.673157 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 11 13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.679346 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 11 13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.720277 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 11 13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.751639 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 11 13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.773433 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 11 
13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.800031 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 11 13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.805708 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 11 13:08:47 crc kubenswrapper[4898]: I1211 13:08:47.995380 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.032171 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.050387 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.059375 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.072236 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.117109 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.157977 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.215102 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.276377 4898 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.425664 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.434681 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.437416 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.464313 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.557880 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.559190 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.579545 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.665271 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.669671 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.682990 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 
13:08:48.792321 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.799036 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.822649 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 11 13:08:48 crc kubenswrapper[4898]: I1211 13:08:48.890201 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.075506 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.171591 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.203406 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.261697 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.301000 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.303613 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.315819 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 11 13:08:49 crc 
kubenswrapper[4898]: I1211 13:08:49.374718 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.399796 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.564824 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.620014 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.633742 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.697711 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.723153 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.797958 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.805549 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.827745 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.891436 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.967607 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 11 13:08:49 crc kubenswrapper[4898]: I1211 13:08:49.996622 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.145637 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.166171 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.242287 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.350081 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.450448 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.463617 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.529034 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.547138 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.695438 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.701916 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.776774 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.805806 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.808137 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.809239 4898 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.813049 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.813093 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.816492 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.833440 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.833416957 podStartE2EDuration="21.833416957s" podCreationTimestamp="2025-12-11 13:08:29 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:08:50.829501631 +0000 UTC m=+288.401828068" watchObservedRunningTime="2025-12-11 13:08:50.833416957 +0000 UTC m=+288.405743394" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.869962 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 11 13:08:50 crc kubenswrapper[4898]: I1211 13:08:50.956505 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.048142 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.071256 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.101138 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.102584 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.120685 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.138407 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.291541 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.298075 4898 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.481543 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.576861 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.633841 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.664730 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.685362 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.689503 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.825330 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.826270 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.897239 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.933268 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 11 
13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.952980 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 11 13:08:51 crc kubenswrapper[4898]: I1211 13:08:51.962198 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.048920 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.065027 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.090259 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.130642 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.223345 4898 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.223644 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://09f45a93d3650f00907d6932465b34c3688f3d834a8480b3e40015627fbd9008" gracePeriod=5 Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.337300 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.383565 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.384178 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.417221 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.437057 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.489009 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.693266 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.915133 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.927187 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 11 13:08:52 crc kubenswrapper[4898]: I1211 13:08:52.936160 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.004515 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.093235 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.206954 4898 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.218729 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.330231 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.379831 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.451159 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.585638 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.669160 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.761349 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mmpgc"] Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.763259 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mmpgc" podUID="0fca7537-45c2-4c42-adab-0c373132c342" containerName="registry-server" containerID="cri-o://4f37643ee7880d10e37c2a2d33ca1c9580267be5bd02a56426ccb06cb0594a2f" gracePeriod=30 Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.771765 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 11 13:08:53 crc 
kubenswrapper[4898]: I1211 13:08:53.772066 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lsvvk"] Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.772277 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lsvvk" podUID="815c6898-33a4-4a0d-b751-689267c17053" containerName="registry-server" containerID="cri-o://4028daaf0a69229623ed7dbde7f9fe76d42430aa92c52a7684088e773ee6347d" gracePeriod=30 Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.792373 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-92m7s"] Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.792661 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" podUID="5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7" containerName="marketplace-operator" containerID="cri-o://81686dc63fb1f23bef1974de32d5b70df5510bdbff8216a00b65a616ad229fdb" gracePeriod=30 Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.798167 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kl8sx"] Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.798638 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kl8sx" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" containerName="registry-server" containerID="cri-o://6045840bed859fbe133d9e92bd49915c2570c2780e21de82232e6f3376f5abdb" gracePeriod=30 Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.806093 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dnsvs"] Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.806346 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dnsvs" 
podUID="e3afb839-6915-42b9-9b88-72d11bf5caa2" containerName="registry-server" containerID="cri-o://c9d3948613f10500c34ef4e92fb7d39072b31f772f6a601c81f891d5536d62d9" gracePeriod=30 Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.818541 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-trbn2"] Dec 11 13:08:53 crc kubenswrapper[4898]: E1211 13:08:53.819070 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" containerName="installer" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.819173 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" containerName="installer" Dec 11 13:08:53 crc kubenswrapper[4898]: E1211 13:08:53.819259 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.819332 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.821088 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b8fd27-089f-46e4-985c-09e1feb795aa" containerName="installer" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.821200 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.821753 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.841544 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-trbn2"] Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.901681 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.958275 4898 generic.go:334] "Generic (PLEG): container finished" podID="5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7" containerID="81686dc63fb1f23bef1974de32d5b70df5510bdbff8216a00b65a616ad229fdb" exitCode=0 Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.958386 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" event={"ID":"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7","Type":"ContainerDied","Data":"81686dc63fb1f23bef1974de32d5b70df5510bdbff8216a00b65a616ad229fdb"} Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.960960 4898 generic.go:334] "Generic (PLEG): container finished" podID="e3afb839-6915-42b9-9b88-72d11bf5caa2" containerID="c9d3948613f10500c34ef4e92fb7d39072b31f772f6a601c81f891d5536d62d9" exitCode=0 Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.961007 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnsvs" event={"ID":"e3afb839-6915-42b9-9b88-72d11bf5caa2","Type":"ContainerDied","Data":"c9d3948613f10500c34ef4e92fb7d39072b31f772f6a601c81f891d5536d62d9"} Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.962968 4898 generic.go:334] "Generic (PLEG): container finished" podID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" containerID="6045840bed859fbe133d9e92bd49915c2570c2780e21de82232e6f3376f5abdb" exitCode=0 Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.963002 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kl8sx" event={"ID":"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389","Type":"ContainerDied","Data":"6045840bed859fbe133d9e92bd49915c2570c2780e21de82232e6f3376f5abdb"} Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.964348 4898 generic.go:334] "Generic (PLEG): container finished" podID="0fca7537-45c2-4c42-adab-0c373132c342" containerID="4f37643ee7880d10e37c2a2d33ca1c9580267be5bd02a56426ccb06cb0594a2f" exitCode=0 Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.964382 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmpgc" event={"ID":"0fca7537-45c2-4c42-adab-0c373132c342","Type":"ContainerDied","Data":"4f37643ee7880d10e37c2a2d33ca1c9580267be5bd02a56426ccb06cb0594a2f"} Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.966145 4898 generic.go:334] "Generic (PLEG): container finished" podID="815c6898-33a4-4a0d-b751-689267c17053" containerID="4028daaf0a69229623ed7dbde7f9fe76d42430aa92c52a7684088e773ee6347d" exitCode=0 Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.966180 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsvvk" event={"ID":"815c6898-33a4-4a0d-b751-689267c17053","Type":"ContainerDied","Data":"4028daaf0a69229623ed7dbde7f9fe76d42430aa92c52a7684088e773ee6347d"} Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.985444 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb6a1bca-9d01-4e70-882f-47a6e90923df-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-trbn2\" (UID: \"bb6a1bca-9d01-4e70-882f-47a6e90923df\") " pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.985531 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb6a1bca-9d01-4e70-882f-47a6e90923df-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-trbn2\" (UID: \"bb6a1bca-9d01-4e70-882f-47a6e90923df\") " pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.985571 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4fjh\" (UniqueName: \"kubernetes.io/projected/bb6a1bca-9d01-4e70-882f-47a6e90923df-kube-api-access-n4fjh\") pod \"marketplace-operator-79b997595-trbn2\" (UID: \"bb6a1bca-9d01-4e70-882f-47a6e90923df\") " pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" Dec 11 13:08:53 crc kubenswrapper[4898]: I1211 13:08:53.985795 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.086210 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb6a1bca-9d01-4e70-882f-47a6e90923df-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-trbn2\" (UID: \"bb6a1bca-9d01-4e70-882f-47a6e90923df\") " pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.086274 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4fjh\" (UniqueName: \"kubernetes.io/projected/bb6a1bca-9d01-4e70-882f-47a6e90923df-kube-api-access-n4fjh\") pod \"marketplace-operator-79b997595-trbn2\" (UID: \"bb6a1bca-9d01-4e70-882f-47a6e90923df\") " pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.086317 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bb6a1bca-9d01-4e70-882f-47a6e90923df-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-trbn2\" (UID: \"bb6a1bca-9d01-4e70-882f-47a6e90923df\") " pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.092401 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb6a1bca-9d01-4e70-882f-47a6e90923df-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-trbn2\" (UID: \"bb6a1bca-9d01-4e70-882f-47a6e90923df\") " pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.107623 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4fjh\" (UniqueName: \"kubernetes.io/projected/bb6a1bca-9d01-4e70-882f-47a6e90923df-kube-api-access-n4fjh\") pod \"marketplace-operator-79b997595-trbn2\" (UID: \"bb6a1bca-9d01-4e70-882f-47a6e90923df\") " pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.112603 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb6a1bca-9d01-4e70-882f-47a6e90923df-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-trbn2\" (UID: \"bb6a1bca-9d01-4e70-882f-47a6e90923df\") " pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.137848 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.166982 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.178860 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.195686 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmpgc" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.208148 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.233607 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lsvvk" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.235200 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.235972 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kl8sx" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.257072 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.283960 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.392119 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815c6898-33a4-4a0d-b751-689267c17053-utilities\") pod \"815c6898-33a4-4a0d-b751-689267c17053\" (UID: \"815c6898-33a4-4a0d-b751-689267c17053\") " Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.392362 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fca7537-45c2-4c42-adab-0c373132c342-catalog-content\") pod \"0fca7537-45c2-4c42-adab-0c373132c342\" (UID: \"0fca7537-45c2-4c42-adab-0c373132c342\") " Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.392399 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v79b5\" (UniqueName: \"kubernetes.io/projected/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-kube-api-access-v79b5\") pod \"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7\" (UID: \"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7\") " Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.392434 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-marketplace-trusted-ca\") pod \"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7\" (UID: \"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7\") " Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.392479 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3afb839-6915-42b9-9b88-72d11bf5caa2-catalog-content\") pod \"e3afb839-6915-42b9-9b88-72d11bf5caa2\" (UID: \"e3afb839-6915-42b9-9b88-72d11bf5caa2\") " Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.392522 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-marketplace-operator-metrics\") pod \"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7\" (UID: \"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7\") " Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.392571 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j458j\" (UniqueName: \"kubernetes.io/projected/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-kube-api-access-j458j\") pod \"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389\" (UID: \"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389\") " Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.392596 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhfdb\" (UniqueName: \"kubernetes.io/projected/0fca7537-45c2-4c42-adab-0c373132c342-kube-api-access-jhfdb\") pod \"0fca7537-45c2-4c42-adab-0c373132c342\" (UID: \"0fca7537-45c2-4c42-adab-0c373132c342\") " Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.392615 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fca7537-45c2-4c42-adab-0c373132c342-utilities\") pod \"0fca7537-45c2-4c42-adab-0c373132c342\" (UID: \"0fca7537-45c2-4c42-adab-0c373132c342\") " Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.392632 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3afb839-6915-42b9-9b88-72d11bf5caa2-utilities\") pod \"e3afb839-6915-42b9-9b88-72d11bf5caa2\" (UID: \"e3afb839-6915-42b9-9b88-72d11bf5caa2\") " Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.392657 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815c6898-33a4-4a0d-b751-689267c17053-catalog-content\") pod \"815c6898-33a4-4a0d-b751-689267c17053\" (UID: 
\"815c6898-33a4-4a0d-b751-689267c17053\") " Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.392672 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkpxv\" (UniqueName: \"kubernetes.io/projected/815c6898-33a4-4a0d-b751-689267c17053-kube-api-access-rkpxv\") pod \"815c6898-33a4-4a0d-b751-689267c17053\" (UID: \"815c6898-33a4-4a0d-b751-689267c17053\") " Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.392689 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-utilities\") pod \"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389\" (UID: \"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389\") " Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.392725 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-catalog-content\") pod \"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389\" (UID: \"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389\") " Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.392746 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbvtn\" (UniqueName: \"kubernetes.io/projected/e3afb839-6915-42b9-9b88-72d11bf5caa2-kube-api-access-rbvtn\") pod \"e3afb839-6915-42b9-9b88-72d11bf5caa2\" (UID: \"e3afb839-6915-42b9-9b88-72d11bf5caa2\") " Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.393076 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/815c6898-33a4-4a0d-b751-689267c17053-utilities" (OuterVolumeSpecName: "utilities") pod "815c6898-33a4-4a0d-b751-689267c17053" (UID: "815c6898-33a4-4a0d-b751-689267c17053"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.393906 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fca7537-45c2-4c42-adab-0c373132c342-utilities" (OuterVolumeSpecName: "utilities") pod "0fca7537-45c2-4c42-adab-0c373132c342" (UID: "0fca7537-45c2-4c42-adab-0c373132c342"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.394758 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3afb839-6915-42b9-9b88-72d11bf5caa2-utilities" (OuterVolumeSpecName: "utilities") pod "e3afb839-6915-42b9-9b88-72d11bf5caa2" (UID: "e3afb839-6915-42b9-9b88-72d11bf5caa2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.396965 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7" (UID: "5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.397484 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-kube-api-access-j458j" (OuterVolumeSpecName: "kube-api-access-j458j") pod "9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" (UID: "9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389"). InnerVolumeSpecName "kube-api-access-j458j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.399655 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-utilities" (OuterVolumeSpecName: "utilities") pod "9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" (UID: "9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.401954 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3afb839-6915-42b9-9b88-72d11bf5caa2-kube-api-access-rbvtn" (OuterVolumeSpecName: "kube-api-access-rbvtn") pod "e3afb839-6915-42b9-9b88-72d11bf5caa2" (UID: "e3afb839-6915-42b9-9b88-72d11bf5caa2"). InnerVolumeSpecName "kube-api-access-rbvtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.406135 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-kube-api-access-v79b5" (OuterVolumeSpecName: "kube-api-access-v79b5") pod "5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7" (UID: "5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7"). InnerVolumeSpecName "kube-api-access-v79b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.406354 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815c6898-33a4-4a0d-b751-689267c17053-kube-api-access-rkpxv" (OuterVolumeSpecName: "kube-api-access-rkpxv") pod "815c6898-33a4-4a0d-b751-689267c17053" (UID: "815c6898-33a4-4a0d-b751-689267c17053"). InnerVolumeSpecName "kube-api-access-rkpxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.410443 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7" (UID: "5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.415280 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fca7537-45c2-4c42-adab-0c373132c342-kube-api-access-jhfdb" (OuterVolumeSpecName: "kube-api-access-jhfdb") pod "0fca7537-45c2-4c42-adab-0c373132c342" (UID: "0fca7537-45c2-4c42-adab-0c373132c342"). InnerVolumeSpecName "kube-api-access-jhfdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.425873 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-trbn2"] Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.434535 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.437163 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" (UID: "9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.450715 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.457623 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/815c6898-33a4-4a0d-b751-689267c17053-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "815c6898-33a4-4a0d-b751-689267c17053" (UID: "815c6898-33a4-4a0d-b751-689267c17053"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.467048 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fca7537-45c2-4c42-adab-0c373132c342-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fca7537-45c2-4c42-adab-0c373132c342" (UID: "0fca7537-45c2-4c42-adab-0c373132c342"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.477572 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.493291 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j458j\" (UniqueName: \"kubernetes.io/projected/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-kube-api-access-j458j\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.493325 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhfdb\" (UniqueName: \"kubernetes.io/projected/0fca7537-45c2-4c42-adab-0c373132c342-kube-api-access-jhfdb\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.493337 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fca7537-45c2-4c42-adab-0c373132c342-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.493349 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3afb839-6915-42b9-9b88-72d11bf5caa2-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.493370 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/815c6898-33a4-4a0d-b751-689267c17053-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.493379 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkpxv\" (UniqueName: \"kubernetes.io/projected/815c6898-33a4-4a0d-b751-689267c17053-kube-api-access-rkpxv\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.493389 4898 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.493399 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.493410 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbvtn\" (UniqueName: \"kubernetes.io/projected/e3afb839-6915-42b9-9b88-72d11bf5caa2-kube-api-access-rbvtn\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.493420 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/815c6898-33a4-4a0d-b751-689267c17053-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.493429 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fca7537-45c2-4c42-adab-0c373132c342-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.493473 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v79b5\" (UniqueName: \"kubernetes.io/projected/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-kube-api-access-v79b5\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.493484 4898 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.493496 4898 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.539708 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3afb839-6915-42b9-9b88-72d11bf5caa2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3afb839-6915-42b9-9b88-72d11bf5caa2" (UID: "e3afb839-6915-42b9-9b88-72d11bf5caa2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.544323 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.594669 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3afb839-6915-42b9-9b88-72d11bf5caa2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.663889 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.747434 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.972765 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnsvs" event={"ID":"e3afb839-6915-42b9-9b88-72d11bf5caa2","Type":"ContainerDied","Data":"4a46a68985b15098357ae9cd7c5e76aa77609e6afacff4cf4e720f6876bcfa51"} Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.972818 4898 scope.go:117] "RemoveContainer" containerID="c9d3948613f10500c34ef4e92fb7d39072b31f772f6a601c81f891d5536d62d9" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.973862 4898 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dnsvs" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.976938 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kl8sx" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.977263 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kl8sx" event={"ID":"9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389","Type":"ContainerDied","Data":"004116e9819d4d531c207532ccb82cbc623f24dd1e7f604337e5ebf912285867"} Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.980293 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmpgc" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.980472 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmpgc" event={"ID":"0fca7537-45c2-4c42-adab-0c373132c342","Type":"ContainerDied","Data":"1e596a6261ff49e750f0b01904dfa27f844d0c440a257847606c4cb5a745e41a"} Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.985294 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lsvvk" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.985308 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsvvk" event={"ID":"815c6898-33a4-4a0d-b751-689267c17053","Type":"ContainerDied","Data":"21adf4bb027db954d33468bf7b347277209ad667b342f9f71dd8cf805e18709b"} Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.988565 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" event={"ID":"5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7","Type":"ContainerDied","Data":"fe6294ae9cd2a4722a60cab6aaf4048fa5ccb63c306edef175d95f7ef23c33f1"} Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.988627 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-92m7s" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.992213 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" event={"ID":"bb6a1bca-9d01-4e70-882f-47a6e90923df","Type":"ContainerStarted","Data":"053290ba41d6c9ae253f3c246afc836907853f9421ef228e4039534250a69b42"} Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.992260 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" event={"ID":"bb6a1bca-9d01-4e70-882f-47a6e90923df","Type":"ContainerStarted","Data":"db322b2cebed0bc3730a992a58d0828a858bef5c03d1a0de479875cbcf7cd02d"} Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.993116 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" Dec 11 13:08:54 crc kubenswrapper[4898]: I1211 13:08:54.993464 4898 scope.go:117] "RemoveContainer" containerID="c23675cc5b4b314df18f2c5f7837e0cabe79692803612d0e426ab6af2c5abd17" Dec 11 
13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.000984 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dnsvs"] Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.018802 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.022668 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dnsvs"] Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.024646 4898 scope.go:117] "RemoveContainer" containerID="b9d59b1e43acdeafae38373af523219fc0ed26a5b8a2520a1a9a4dddf4b3cb5b" Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.027501 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kl8sx"] Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.039176 4898 scope.go:117] "RemoveContainer" containerID="6045840bed859fbe133d9e92bd49915c2570c2780e21de82232e6f3376f5abdb" Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.043589 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kl8sx"] Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.045149 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lsvvk"] Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.049031 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lsvvk"] Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.051913 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mmpgc"] Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.054825 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mmpgc"] Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.057847 4898 
scope.go:117] "RemoveContainer" containerID="e7a98467bf023c9eb677a9d51cccd7aee17c57c5b019db21a6ed7d14acec6321" Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.067395 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" podStartSLOduration=2.067377812 podStartE2EDuration="2.067377812s" podCreationTimestamp="2025-12-11 13:08:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:08:55.06431652 +0000 UTC m=+292.636642957" watchObservedRunningTime="2025-12-11 13:08:55.067377812 +0000 UTC m=+292.639704249" Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.078270 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-92m7s"] Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.086325 4898 scope.go:117] "RemoveContainer" containerID="ee92dc81976314da41a11ca82e58bbc002d0cbb3ce4c810295a63c11a9467691" Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.090467 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.091950 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.091994 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-92m7s"] Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.109257 4898 scope.go:117] "RemoveContainer" containerID="4f37643ee7880d10e37c2a2d33ca1c9580267be5bd02a56426ccb06cb0594a2f" Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.111327 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 11 13:08:55 crc kubenswrapper[4898]: 
I1211 13:08:55.129977 4898 scope.go:117] "RemoveContainer" containerID="cd0373435e022c2676c7b77522ca912af29676770986a0ec3e6327734ba10ddc" Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.148750 4898 scope.go:117] "RemoveContainer" containerID="77a0d31a682497f45a0e260fd72190c8bcabb20d50392b45dff1356100166635" Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.164326 4898 scope.go:117] "RemoveContainer" containerID="4028daaf0a69229623ed7dbde7f9fe76d42430aa92c52a7684088e773ee6347d" Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.183698 4898 scope.go:117] "RemoveContainer" containerID="f41ed5bd5339af68e5578a8bda1e30b14b5277af77862560d2b3d7cb0bf9d45b" Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.198868 4898 scope.go:117] "RemoveContainer" containerID="14574a77796dcddf703853dc6e12ffe7449522da44fe7e5c1a94db882991860d" Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.213359 4898 scope.go:117] "RemoveContainer" containerID="81686dc63fb1f23bef1974de32d5b70df5510bdbff8216a00b65a616ad229fdb" Dec 11 13:08:55 crc kubenswrapper[4898]: I1211 13:08:55.230899 4898 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 11 13:08:56 crc kubenswrapper[4898]: I1211 13:08:56.781225 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fca7537-45c2-4c42-adab-0c373132c342" path="/var/lib/kubelet/pods/0fca7537-45c2-4c42-adab-0c373132c342/volumes" Dec 11 13:08:56 crc kubenswrapper[4898]: I1211 13:08:56.782016 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7" path="/var/lib/kubelet/pods/5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7/volumes" Dec 11 13:08:56 crc kubenswrapper[4898]: I1211 13:08:56.782600 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815c6898-33a4-4a0d-b751-689267c17053" path="/var/lib/kubelet/pods/815c6898-33a4-4a0d-b751-689267c17053/volumes" Dec 11 13:08:56 crc kubenswrapper[4898]: 
I1211 13:08:56.783617 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" path="/var/lib/kubelet/pods/9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389/volumes" Dec 11 13:08:56 crc kubenswrapper[4898]: I1211 13:08:56.784212 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3afb839-6915-42b9-9b88-72d11bf5caa2" path="/var/lib/kubelet/pods/e3afb839-6915-42b9-9b88-72d11bf5caa2/volumes" Dec 11 13:08:57 crc kubenswrapper[4898]: I1211 13:08:57.916742 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 11 13:08:57 crc kubenswrapper[4898]: I1211 13:08:57.917114 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.015398 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.015492 4898 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="09f45a93d3650f00907d6932465b34c3688f3d834a8480b3e40015627fbd9008" exitCode=137 Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.015559 4898 scope.go:117] "RemoveContainer" containerID="09f45a93d3650f00907d6932465b34c3688f3d834a8480b3e40015627fbd9008" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.015598 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.031309 4898 scope.go:117] "RemoveContainer" containerID="09f45a93d3650f00907d6932465b34c3688f3d834a8480b3e40015627fbd9008" Dec 11 13:08:58 crc kubenswrapper[4898]: E1211 13:08:58.031837 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09f45a93d3650f00907d6932465b34c3688f3d834a8480b3e40015627fbd9008\": container with ID starting with 09f45a93d3650f00907d6932465b34c3688f3d834a8480b3e40015627fbd9008 not found: ID does not exist" containerID="09f45a93d3650f00907d6932465b34c3688f3d834a8480b3e40015627fbd9008" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.031901 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09f45a93d3650f00907d6932465b34c3688f3d834a8480b3e40015627fbd9008"} err="failed to get container status \"09f45a93d3650f00907d6932465b34c3688f3d834a8480b3e40015627fbd9008\": rpc error: code = NotFound desc = could not find container \"09f45a93d3650f00907d6932465b34c3688f3d834a8480b3e40015627fbd9008\": container with ID starting with 09f45a93d3650f00907d6932465b34c3688f3d834a8480b3e40015627fbd9008 not found: ID does not exist" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.033412 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.033485 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 11 
13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.033560 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.033629 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.033670 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.033697 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.033750 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.033751 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.033771 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.034124 4898 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.034145 4898 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.034156 4898 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.034170 4898 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.040655 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.134779 4898 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 11 13:08:58 crc kubenswrapper[4898]: I1211 13:08:58.792598 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 11 13:09:06 crc kubenswrapper[4898]: I1211 13:09:06.236557 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 11 13:09:06 crc kubenswrapper[4898]: I1211 13:09:06.966439 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 11 13:09:07 crc kubenswrapper[4898]: I1211 13:09:07.323810 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 11 13:09:09 crc kubenswrapper[4898]: I1211 13:09:09.555342 4898 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 11 13:09:13 crc kubenswrapper[4898]: I1211 13:09:13.156022 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 11 13:09:13 crc kubenswrapper[4898]: I1211 13:09:13.678665 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 11 13:09:14 crc kubenswrapper[4898]: I1211 13:09:14.186360 4898 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 11 13:09:15 crc kubenswrapper[4898]: I1211 13:09:15.011447 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 11 13:09:15 crc kubenswrapper[4898]: I1211 13:09:15.341536 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 11 13:09:15 crc kubenswrapper[4898]: I1211 13:09:15.503050 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 11 13:09:15 crc kubenswrapper[4898]: I1211 13:09:15.575609 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 11 13:09:17 crc kubenswrapper[4898]: I1211 13:09:17.219633 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 11 13:09:17 crc kubenswrapper[4898]: I1211 13:09:17.430752 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 11 13:09:19 crc kubenswrapper[4898]: I1211 13:09:19.069411 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 11 13:09:19 crc kubenswrapper[4898]: I1211 13:09:19.279413 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 11 13:09:20 crc kubenswrapper[4898]: I1211 13:09:20.078926 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 11 13:09:20 crc kubenswrapper[4898]: I1211 13:09:20.535496 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 11 13:09:21 crc kubenswrapper[4898]: I1211 
13:09:21.161858 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 11 13:09:22 crc kubenswrapper[4898]: I1211 13:09:22.288820 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281260 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp"] Dec 11 13:09:23 crc kubenswrapper[4898]: E1211 13:09:23.281428 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" containerName="extract-content" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281439 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" containerName="extract-content" Dec 11 13:09:23 crc kubenswrapper[4898]: E1211 13:09:23.281465 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3afb839-6915-42b9-9b88-72d11bf5caa2" containerName="registry-server" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281471 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3afb839-6915-42b9-9b88-72d11bf5caa2" containerName="registry-server" Dec 11 13:09:23 crc kubenswrapper[4898]: E1211 13:09:23.281479 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fca7537-45c2-4c42-adab-0c373132c342" containerName="extract-utilities" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281486 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fca7537-45c2-4c42-adab-0c373132c342" containerName="extract-utilities" Dec 11 13:09:23 crc kubenswrapper[4898]: E1211 13:09:23.281496 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" containerName="registry-server" Dec 11 13:09:23 crc kubenswrapper[4898]: 
I1211 13:09:23.281501 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" containerName="registry-server" Dec 11 13:09:23 crc kubenswrapper[4898]: E1211 13:09:23.281508 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7" containerName="marketplace-operator" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281514 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7" containerName="marketplace-operator" Dec 11 13:09:23 crc kubenswrapper[4898]: E1211 13:09:23.281523 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3afb839-6915-42b9-9b88-72d11bf5caa2" containerName="extract-utilities" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281528 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3afb839-6915-42b9-9b88-72d11bf5caa2" containerName="extract-utilities" Dec 11 13:09:23 crc kubenswrapper[4898]: E1211 13:09:23.281534 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3afb839-6915-42b9-9b88-72d11bf5caa2" containerName="extract-content" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281539 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3afb839-6915-42b9-9b88-72d11bf5caa2" containerName="extract-content" Dec 11 13:09:23 crc kubenswrapper[4898]: E1211 13:09:23.281546 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fca7537-45c2-4c42-adab-0c373132c342" containerName="extract-content" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281552 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fca7537-45c2-4c42-adab-0c373132c342" containerName="extract-content" Dec 11 13:09:23 crc kubenswrapper[4898]: E1211 13:09:23.281560 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fca7537-45c2-4c42-adab-0c373132c342" containerName="registry-server" Dec 11 13:09:23 crc kubenswrapper[4898]: 
I1211 13:09:23.281566 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fca7537-45c2-4c42-adab-0c373132c342" containerName="registry-server" Dec 11 13:09:23 crc kubenswrapper[4898]: E1211 13:09:23.281577 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" containerName="extract-utilities" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281583 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" containerName="extract-utilities" Dec 11 13:09:23 crc kubenswrapper[4898]: E1211 13:09:23.281590 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815c6898-33a4-4a0d-b751-689267c17053" containerName="extract-utilities" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281595 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="815c6898-33a4-4a0d-b751-689267c17053" containerName="extract-utilities" Dec 11 13:09:23 crc kubenswrapper[4898]: E1211 13:09:23.281604 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815c6898-33a4-4a0d-b751-689267c17053" containerName="extract-content" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281609 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="815c6898-33a4-4a0d-b751-689267c17053" containerName="extract-content" Dec 11 13:09:23 crc kubenswrapper[4898]: E1211 13:09:23.281617 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815c6898-33a4-4a0d-b751-689267c17053" containerName="registry-server" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281623 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="815c6898-33a4-4a0d-b751-689267c17053" containerName="registry-server" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281703 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fca7537-45c2-4c42-adab-0c373132c342" containerName="registry-server" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 
13:09:23.281713 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fbaabe9-b4fe-4a1d-9733-b64e0c1c5389" containerName="registry-server" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281722 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cfb0e1f-8b8f-4f19-9a91-87a9c2bd42f7" containerName="marketplace-operator" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281732 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="815c6898-33a4-4a0d-b751-689267c17053" containerName="registry-server" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.281738 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3afb839-6915-42b9-9b88-72d11bf5caa2" containerName="registry-server" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.282068 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.284866 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.284891 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.284891 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.284984 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.286369 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.296438 4898 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp"] Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.440075 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zkl2\" (UniqueName: \"kubernetes.io/projected/19a6c72a-2ab6-4971-ba09-c480e7f11d93-kube-api-access-7zkl2\") pod \"cluster-monitoring-operator-6d5b84845-jq4bp\" (UID: \"19a6c72a-2ab6-4971-ba09-c480e7f11d93\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.440434 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/19a6c72a-2ab6-4971-ba09-c480e7f11d93-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-jq4bp\" (UID: \"19a6c72a-2ab6-4971-ba09-c480e7f11d93\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.440539 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/19a6c72a-2ab6-4971-ba09-c480e7f11d93-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-jq4bp\" (UID: \"19a6c72a-2ab6-4971-ba09-c480e7f11d93\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.467183 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.541628 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/19a6c72a-2ab6-4971-ba09-c480e7f11d93-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-jq4bp\" (UID: 
\"19a6c72a-2ab6-4971-ba09-c480e7f11d93\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.541680 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/19a6c72a-2ab6-4971-ba09-c480e7f11d93-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-jq4bp\" (UID: \"19a6c72a-2ab6-4971-ba09-c480e7f11d93\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.541703 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zkl2\" (UniqueName: \"kubernetes.io/projected/19a6c72a-2ab6-4971-ba09-c480e7f11d93-kube-api-access-7zkl2\") pod \"cluster-monitoring-operator-6d5b84845-jq4bp\" (UID: \"19a6c72a-2ab6-4971-ba09-c480e7f11d93\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.542907 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/19a6c72a-2ab6-4971-ba09-c480e7f11d93-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-jq4bp\" (UID: \"19a6c72a-2ab6-4971-ba09-c480e7f11d93\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.549261 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/19a6c72a-2ab6-4971-ba09-c480e7f11d93-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-jq4bp\" (UID: \"19a6c72a-2ab6-4971-ba09-c480e7f11d93\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.558592 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7zkl2\" (UniqueName: \"kubernetes.io/projected/19a6c72a-2ab6-4971-ba09-c480e7f11d93-kube-api-access-7zkl2\") pod \"cluster-monitoring-operator-6d5b84845-jq4bp\" (UID: \"19a6c72a-2ab6-4971-ba09-c480e7f11d93\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.611754 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp" Dec 11 13:09:23 crc kubenswrapper[4898]: I1211 13:09:23.811166 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp"] Dec 11 13:09:24 crc kubenswrapper[4898]: I1211 13:09:24.150395 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp" event={"ID":"19a6c72a-2ab6-4971-ba09-c480e7f11d93","Type":"ContainerStarted","Data":"a5f03b615773ef468f44fdd8860af96c574fbfd39ab093387530485d8a8842be"} Dec 11 13:09:24 crc kubenswrapper[4898]: I1211 13:09:24.466815 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 11 13:09:25 crc kubenswrapper[4898]: I1211 13:09:25.099150 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 11 13:09:26 crc kubenswrapper[4898]: I1211 13:09:26.952304 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq"] Dec 11 13:09:26 crc kubenswrapper[4898]: I1211 13:09:26.953697 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" Dec 11 13:09:26 crc kubenswrapper[4898]: I1211 13:09:26.956807 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 11 13:09:26 crc kubenswrapper[4898]: I1211 13:09:26.961109 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq"] Dec 11 13:09:27 crc kubenswrapper[4898]: I1211 13:09:27.095284 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-56mlq\" (UID: \"eca70e56-1feb-4a48-9c32-db8e075ebfff\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" Dec 11 13:09:27 crc kubenswrapper[4898]: I1211 13:09:27.165796 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp" event={"ID":"19a6c72a-2ab6-4971-ba09-c480e7f11d93","Type":"ContainerStarted","Data":"d840c5998efcad5ff0a6ee32456ac2edad63824a49537ecb67fa96bf1647994a"} Dec 11 13:09:27 crc kubenswrapper[4898]: I1211 13:09:27.182266 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jq4bp" podStartSLOduration=1.61667046 podStartE2EDuration="4.182247709s" podCreationTimestamp="2025-12-11 13:09:23 +0000 UTC" firstStartedPulling="2025-12-11 13:09:23.821169881 +0000 UTC m=+321.393496318" lastFinishedPulling="2025-12-11 13:09:26.38674713 +0000 UTC m=+323.959073567" observedRunningTime="2025-12-11 13:09:27.181336394 +0000 UTC m=+324.753662871" watchObservedRunningTime="2025-12-11 13:09:27.182247709 +0000 UTC m=+324.754574146" Dec 11 13:09:27 crc 
kubenswrapper[4898]: I1211 13:09:27.196970 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-56mlq\" (UID: \"eca70e56-1feb-4a48-9c32-db8e075ebfff\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" Dec 11 13:09:27 crc kubenswrapper[4898]: E1211 13:09:27.197251 4898 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 11 13:09:27 crc kubenswrapper[4898]: E1211 13:09:27.197356 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates podName:eca70e56-1feb-4a48-9c32-db8e075ebfff nodeName:}" failed. No retries permitted until 2025-12-11 13:09:27.697329364 +0000 UTC m=+325.269655821 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-56mlq" (UID: "eca70e56-1feb-4a48-9c32-db8e075ebfff") : secret "prometheus-operator-admission-webhook-tls" not found Dec 11 13:09:27 crc kubenswrapper[4898]: I1211 13:09:27.703756 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-56mlq\" (UID: \"eca70e56-1feb-4a48-9c32-db8e075ebfff\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" Dec 11 13:09:27 crc kubenswrapper[4898]: E1211 13:09:27.703992 4898 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 11 13:09:27 crc kubenswrapper[4898]: E1211 13:09:27.704211 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates podName:eca70e56-1feb-4a48-9c32-db8e075ebfff nodeName:}" failed. No retries permitted until 2025-12-11 13:09:28.70418923 +0000 UTC m=+326.276515677 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-56mlq" (UID: "eca70e56-1feb-4a48-9c32-db8e075ebfff") : secret "prometheus-operator-admission-webhook-tls" not found Dec 11 13:09:27 crc kubenswrapper[4898]: I1211 13:09:27.775224 4898 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 11 13:09:27 crc kubenswrapper[4898]: I1211 13:09:27.990611 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 11 13:09:28 crc kubenswrapper[4898]: I1211 13:09:28.717832 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-56mlq\" (UID: \"eca70e56-1feb-4a48-9c32-db8e075ebfff\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" Dec 11 13:09:28 crc kubenswrapper[4898]: E1211 13:09:28.718066 4898 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 11 13:09:28 crc kubenswrapper[4898]: E1211 13:09:28.718169 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates podName:eca70e56-1feb-4a48-9c32-db8e075ebfff nodeName:}" failed. No retries permitted until 2025-12-11 13:09:30.718145088 +0000 UTC m=+328.290471525 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-56mlq" (UID: "eca70e56-1feb-4a48-9c32-db8e075ebfff") : secret "prometheus-operator-admission-webhook-tls" not found Dec 11 13:09:29 crc kubenswrapper[4898]: I1211 13:09:29.806635 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 11 13:09:29 crc kubenswrapper[4898]: I1211 13:09:29.998956 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 11 13:09:30 crc kubenswrapper[4898]: I1211 13:09:30.744672 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-56mlq\" (UID: \"eca70e56-1feb-4a48-9c32-db8e075ebfff\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" Dec 11 13:09:30 crc kubenswrapper[4898]: E1211 13:09:30.744942 4898 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 11 13:09:30 crc kubenswrapper[4898]: E1211 13:09:30.745073 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates podName:eca70e56-1feb-4a48-9c32-db8e075ebfff nodeName:}" failed. No retries permitted until 2025-12-11 13:09:34.74503826 +0000 UTC m=+332.317364737 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-56mlq" (UID: "eca70e56-1feb-4a48-9c32-db8e075ebfff") : secret "prometheus-operator-admission-webhook-tls" not found Dec 11 13:09:31 crc kubenswrapper[4898]: I1211 13:09:31.714768 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.173779 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lpsbl"] Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.174568 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" podUID="a3c9df9f-29e5-45ed-bd07-642ae50d0ed9" containerName="controller-manager" containerID="cri-o://ab995c5f1f5f9f9d65b60620584c0282782f1011515d8925109e9439571767a7" gracePeriod=30 Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.271542 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2"] Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.272047 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" podUID="92578329-85d1-4295-bc06-88764e9d54c2" containerName="route-controller-manager" containerID="cri-o://8172453c4fccc0cee6395f6ebd67fda17373afb5b60717670fd9f48c977824f6" gracePeriod=30 Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.538715 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.605708 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.615801 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-dbf56b754-mxg92"] Dec 11 13:09:32 crc kubenswrapper[4898]: E1211 13:09:32.616017 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92578329-85d1-4295-bc06-88764e9d54c2" containerName="route-controller-manager" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.616034 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="92578329-85d1-4295-bc06-88764e9d54c2" containerName="route-controller-manager" Dec 11 13:09:32 crc kubenswrapper[4898]: E1211 13:09:32.616058 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c9df9f-29e5-45ed-bd07-642ae50d0ed9" containerName="controller-manager" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.616067 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c9df9f-29e5-45ed-bd07-642ae50d0ed9" containerName="controller-manager" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.616175 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c9df9f-29e5-45ed-bd07-642ae50d0ed9" containerName="controller-manager" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.616201 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="92578329-85d1-4295-bc06-88764e9d54c2" containerName="route-controller-manager" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.616660 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.624569 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dbf56b754-mxg92"] Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.666533 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-serving-cert\") pod \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.666580 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-proxy-ca-bundles\") pod \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.666627 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-config\") pod \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.666670 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-client-ca\") pod \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.667101 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-98998769b-5vw79"] Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.667344 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a3c9df9f-29e5-45ed-bd07-642ae50d0ed9" (UID: "a3c9df9f-29e5-45ed-bd07-642ae50d0ed9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.667533 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-config" (OuterVolumeSpecName: "config") pod "a3c9df9f-29e5-45ed-bd07-642ae50d0ed9" (UID: "a3c9df9f-29e5-45ed-bd07-642ae50d0ed9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.667572 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljmkm\" (UniqueName: \"kubernetes.io/projected/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-kube-api-access-ljmkm\") pod \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\" (UID: \"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9\") " Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.667641 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-client-ca" (OuterVolumeSpecName: "client-ca") pod "a3c9df9f-29e5-45ed-bd07-642ae50d0ed9" (UID: "a3c9df9f-29e5-45ed-bd07-642ae50d0ed9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.667792 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.667804 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.667812 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.667916 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.674102 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-kube-api-access-ljmkm" (OuterVolumeSpecName: "kube-api-access-ljmkm") pod "a3c9df9f-29e5-45ed-bd07-642ae50d0ed9" (UID: "a3c9df9f-29e5-45ed-bd07-642ae50d0ed9"). InnerVolumeSpecName "kube-api-access-ljmkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.674158 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a3c9df9f-29e5-45ed-bd07-642ae50d0ed9" (UID: "a3c9df9f-29e5-45ed-bd07-642ae50d0ed9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.680013 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-98998769b-5vw79"] Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.768711 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92578329-85d1-4295-bc06-88764e9d54c2-config\") pod \"92578329-85d1-4295-bc06-88764e9d54c2\" (UID: \"92578329-85d1-4295-bc06-88764e9d54c2\") " Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.768785 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db2rp\" (UniqueName: \"kubernetes.io/projected/92578329-85d1-4295-bc06-88764e9d54c2-kube-api-access-db2rp\") pod \"92578329-85d1-4295-bc06-88764e9d54c2\" (UID: \"92578329-85d1-4295-bc06-88764e9d54c2\") " Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.768853 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92578329-85d1-4295-bc06-88764e9d54c2-client-ca\") pod \"92578329-85d1-4295-bc06-88764e9d54c2\" (UID: \"92578329-85d1-4295-bc06-88764e9d54c2\") " Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.768900 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92578329-85d1-4295-bc06-88764e9d54c2-serving-cert\") pod \"92578329-85d1-4295-bc06-88764e9d54c2\" (UID: \"92578329-85d1-4295-bc06-88764e9d54c2\") " Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.769060 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-config\") pod \"route-controller-manager-98998769b-5vw79\" (UID: 
\"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.769093 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-proxy-ca-bundles\") pod \"controller-manager-dbf56b754-mxg92\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.769111 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-client-ca\") pod \"controller-manager-dbf56b754-mxg92\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.769143 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pkjr\" (UniqueName: \"kubernetes.io/projected/1c9a929f-745b-48d6-9799-f762ee7b14fe-kube-api-access-9pkjr\") pod \"controller-manager-dbf56b754-mxg92\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.769168 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-serving-cert\") pod \"route-controller-manager-98998769b-5vw79\" (UID: \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.769383 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-config\") pod \"controller-manager-dbf56b754-mxg92\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.769487 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c9a929f-745b-48d6-9799-f762ee7b14fe-serving-cert\") pod \"controller-manager-dbf56b754-mxg92\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.769530 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-client-ca\") pod \"route-controller-manager-98998769b-5vw79\" (UID: \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.769624 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4sj8\" (UniqueName: \"kubernetes.io/projected/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-kube-api-access-f4sj8\") pod \"route-controller-manager-98998769b-5vw79\" (UID: \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.769655 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92578329-85d1-4295-bc06-88764e9d54c2-config" (OuterVolumeSpecName: "config") pod "92578329-85d1-4295-bc06-88764e9d54c2" (UID: 
"92578329-85d1-4295-bc06-88764e9d54c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.769682 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92578329-85d1-4295-bc06-88764e9d54c2-client-ca" (OuterVolumeSpecName: "client-ca") pod "92578329-85d1-4295-bc06-88764e9d54c2" (UID: "92578329-85d1-4295-bc06-88764e9d54c2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.769747 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljmkm\" (UniqueName: \"kubernetes.io/projected/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-kube-api-access-ljmkm\") on node \"crc\" DevicePath \"\"" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.769771 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.771529 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92578329-85d1-4295-bc06-88764e9d54c2-kube-api-access-db2rp" (OuterVolumeSpecName: "kube-api-access-db2rp") pod "92578329-85d1-4295-bc06-88764e9d54c2" (UID: "92578329-85d1-4295-bc06-88764e9d54c2"). InnerVolumeSpecName "kube-api-access-db2rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.772262 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92578329-85d1-4295-bc06-88764e9d54c2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "92578329-85d1-4295-bc06-88764e9d54c2" (UID: "92578329-85d1-4295-bc06-88764e9d54c2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.871286 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4sj8\" (UniqueName: \"kubernetes.io/projected/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-kube-api-access-f4sj8\") pod \"route-controller-manager-98998769b-5vw79\" (UID: \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.871391 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-config\") pod \"route-controller-manager-98998769b-5vw79\" (UID: \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.871496 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-proxy-ca-bundles\") pod \"controller-manager-dbf56b754-mxg92\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.871556 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-client-ca\") pod \"controller-manager-dbf56b754-mxg92\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.871631 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pkjr\" (UniqueName: 
\"kubernetes.io/projected/1c9a929f-745b-48d6-9799-f762ee7b14fe-kube-api-access-9pkjr\") pod \"controller-manager-dbf56b754-mxg92\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.871696 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-serving-cert\") pod \"route-controller-manager-98998769b-5vw79\" (UID: \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.871833 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-config\") pod \"controller-manager-dbf56b754-mxg92\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.871888 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c9a929f-745b-48d6-9799-f762ee7b14fe-serving-cert\") pod \"controller-manager-dbf56b754-mxg92\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.871935 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-client-ca\") pod \"route-controller-manager-98998769b-5vw79\" (UID: \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.872079 4898 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db2rp\" (UniqueName: \"kubernetes.io/projected/92578329-85d1-4295-bc06-88764e9d54c2-kube-api-access-db2rp\") on node \"crc\" DevicePath \"\"" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.872110 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92578329-85d1-4295-bc06-88764e9d54c2-client-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.872139 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92578329-85d1-4295-bc06-88764e9d54c2-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.872164 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92578329-85d1-4295-bc06-88764e9d54c2-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.873548 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-client-ca\") pod \"controller-manager-dbf56b754-mxg92\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.874267 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-config\") pod \"route-controller-manager-98998769b-5vw79\" (UID: \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.875074 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-client-ca\") pod \"route-controller-manager-98998769b-5vw79\" (UID: \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.875073 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-proxy-ca-bundles\") pod \"controller-manager-dbf56b754-mxg92\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.875870 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-config\") pod \"controller-manager-dbf56b754-mxg92\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.876213 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-serving-cert\") pod \"route-controller-manager-98998769b-5vw79\" (UID: \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.880388 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c9a929f-745b-48d6-9799-f762ee7b14fe-serving-cert\") pod \"controller-manager-dbf56b754-mxg92\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.890582 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f4sj8\" (UniqueName: \"kubernetes.io/projected/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-kube-api-access-f4sj8\") pod \"route-controller-manager-98998769b-5vw79\" (UID: \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.901968 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pkjr\" (UniqueName: \"kubernetes.io/projected/1c9a929f-745b-48d6-9799-f762ee7b14fe-kube-api-access-9pkjr\") pod \"controller-manager-dbf56b754-mxg92\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.930071 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:32 crc kubenswrapper[4898]: I1211 13:09:32.986274 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:33 crc kubenswrapper[4898]: W1211 13:09:33.192694 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c9a929f_745b_48d6_9799_f762ee7b14fe.slice/crio-fdd29a949ca0d9927f5e4691f7ecadc2f1ac9ddffd00b5cbf3c36da6fa32727e WatchSource:0}: Error finding container fdd29a949ca0d9927f5e4691f7ecadc2f1ac9ddffd00b5cbf3c36da6fa32727e: Status 404 returned error can't find the container with id fdd29a949ca0d9927f5e4691f7ecadc2f1ac9ddffd00b5cbf3c36da6fa32727e Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.193121 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dbf56b754-mxg92"] Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.204593 4898 generic.go:334] "Generic (PLEG): container finished" podID="a3c9df9f-29e5-45ed-bd07-642ae50d0ed9" containerID="ab995c5f1f5f9f9d65b60620584c0282782f1011515d8925109e9439571767a7" exitCode=0 Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.204632 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" event={"ID":"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9","Type":"ContainerDied","Data":"ab995c5f1f5f9f9d65b60620584c0282782f1011515d8925109e9439571767a7"} Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.204675 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" event={"ID":"a3c9df9f-29e5-45ed-bd07-642ae50d0ed9","Type":"ContainerDied","Data":"faac4b437985a5980206e67d881ecc78ea05fd74851705ae735b5f3da3f93ead"} Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.204684 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lpsbl" Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.204696 4898 scope.go:117] "RemoveContainer" containerID="ab995c5f1f5f9f9d65b60620584c0282782f1011515d8925109e9439571767a7" Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.208870 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" event={"ID":"1c9a929f-745b-48d6-9799-f762ee7b14fe","Type":"ContainerStarted","Data":"fdd29a949ca0d9927f5e4691f7ecadc2f1ac9ddffd00b5cbf3c36da6fa32727e"} Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.210414 4898 generic.go:334] "Generic (PLEG): container finished" podID="92578329-85d1-4295-bc06-88764e9d54c2" containerID="8172453c4fccc0cee6395f6ebd67fda17373afb5b60717670fd9f48c977824f6" exitCode=0 Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.210477 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" event={"ID":"92578329-85d1-4295-bc06-88764e9d54c2","Type":"ContainerDied","Data":"8172453c4fccc0cee6395f6ebd67fda17373afb5b60717670fd9f48c977824f6"} Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.210503 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" event={"ID":"92578329-85d1-4295-bc06-88764e9d54c2","Type":"ContainerDied","Data":"15b3f81d1ee3b8f5b1bdb46f09546b19243a411c1d45876cf30a4487834827a1"} Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.210560 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2" Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.224582 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-98998769b-5vw79"] Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.234305 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lpsbl"] Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.239125 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lpsbl"] Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.240463 4898 scope.go:117] "RemoveContainer" containerID="ab995c5f1f5f9f9d65b60620584c0282782f1011515d8925109e9439571767a7" Dec 11 13:09:33 crc kubenswrapper[4898]: E1211 13:09:33.240890 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab995c5f1f5f9f9d65b60620584c0282782f1011515d8925109e9439571767a7\": container with ID starting with ab995c5f1f5f9f9d65b60620584c0282782f1011515d8925109e9439571767a7 not found: ID does not exist" containerID="ab995c5f1f5f9f9d65b60620584c0282782f1011515d8925109e9439571767a7" Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.240922 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab995c5f1f5f9f9d65b60620584c0282782f1011515d8925109e9439571767a7"} err="failed to get container status \"ab995c5f1f5f9f9d65b60620584c0282782f1011515d8925109e9439571767a7\": rpc error: code = NotFound desc = could not find container \"ab995c5f1f5f9f9d65b60620584c0282782f1011515d8925109e9439571767a7\": container with ID starting with ab995c5f1f5f9f9d65b60620584c0282782f1011515d8925109e9439571767a7 not found: ID does not exist" Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.240944 4898 
scope.go:117] "RemoveContainer" containerID="8172453c4fccc0cee6395f6ebd67fda17373afb5b60717670fd9f48c977824f6" Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.242392 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2"] Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.245329 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4wrc2"] Dec 11 13:09:33 crc kubenswrapper[4898]: W1211 13:09:33.245733 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdc5cc96_d8f0_48ea_9837_a74863c2d69d.slice/crio-735fccc1a2af25fba2346fbd7ee28fe8966af1e1cc6f15b7523199a12df71171 WatchSource:0}: Error finding container 735fccc1a2af25fba2346fbd7ee28fe8966af1e1cc6f15b7523199a12df71171: Status 404 returned error can't find the container with id 735fccc1a2af25fba2346fbd7ee28fe8966af1e1cc6f15b7523199a12df71171 Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.255807 4898 scope.go:117] "RemoveContainer" containerID="8172453c4fccc0cee6395f6ebd67fda17373afb5b60717670fd9f48c977824f6" Dec 11 13:09:33 crc kubenswrapper[4898]: E1211 13:09:33.256271 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8172453c4fccc0cee6395f6ebd67fda17373afb5b60717670fd9f48c977824f6\": container with ID starting with 8172453c4fccc0cee6395f6ebd67fda17373afb5b60717670fd9f48c977824f6 not found: ID does not exist" containerID="8172453c4fccc0cee6395f6ebd67fda17373afb5b60717670fd9f48c977824f6" Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.256308 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8172453c4fccc0cee6395f6ebd67fda17373afb5b60717670fd9f48c977824f6"} err="failed to get container status 
\"8172453c4fccc0cee6395f6ebd67fda17373afb5b60717670fd9f48c977824f6\": rpc error: code = NotFound desc = could not find container \"8172453c4fccc0cee6395f6ebd67fda17373afb5b60717670fd9f48c977824f6\": container with ID starting with 8172453c4fccc0cee6395f6ebd67fda17373afb5b60717670fd9f48c977824f6 not found: ID does not exist" Dec 11 13:09:33 crc kubenswrapper[4898]: I1211 13:09:33.713293 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 11 13:09:34 crc kubenswrapper[4898]: I1211 13:09:34.219932 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" event={"ID":"fdc5cc96-d8f0-48ea-9837-a74863c2d69d","Type":"ContainerStarted","Data":"7dc4b9ab4508aaba07a27b6f6ac01a3abfca8bdba82c5eebc1b670771da0daf6"} Dec 11 13:09:34 crc kubenswrapper[4898]: I1211 13:09:34.220334 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" event={"ID":"fdc5cc96-d8f0-48ea-9837-a74863c2d69d","Type":"ContainerStarted","Data":"735fccc1a2af25fba2346fbd7ee28fe8966af1e1cc6f15b7523199a12df71171"} Dec 11 13:09:34 crc kubenswrapper[4898]: I1211 13:09:34.221305 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:34 crc kubenswrapper[4898]: I1211 13:09:34.227263 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" event={"ID":"1c9a929f-745b-48d6-9799-f762ee7b14fe","Type":"ContainerStarted","Data":"ef4276fe2fd24f0cf5ade8cb78337c92a6e94ed9be75fb32e3ecbf6510057c7e"} Dec 11 13:09:34 crc kubenswrapper[4898]: I1211 13:09:34.228257 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:34 crc 
kubenswrapper[4898]: I1211 13:09:34.229426 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:09:34 crc kubenswrapper[4898]: I1211 13:09:34.236418 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:09:34 crc kubenswrapper[4898]: I1211 13:09:34.248288 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" podStartSLOduration=2.248257084 podStartE2EDuration="2.248257084s" podCreationTimestamp="2025-12-11 13:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:09:34.24186424 +0000 UTC m=+331.814190687" watchObservedRunningTime="2025-12-11 13:09:34.248257084 +0000 UTC m=+331.820583561" Dec 11 13:09:34 crc kubenswrapper[4898]: I1211 13:09:34.256942 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 11 13:09:34 crc kubenswrapper[4898]: I1211 13:09:34.305615 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" podStartSLOduration=2.305593067 podStartE2EDuration="2.305593067s" podCreationTimestamp="2025-12-11 13:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:09:34.299414578 +0000 UTC m=+331.871741035" watchObservedRunningTime="2025-12-11 13:09:34.305593067 +0000 UTC m=+331.877919504" Dec 11 13:09:34 crc kubenswrapper[4898]: I1211 13:09:34.786229 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92578329-85d1-4295-bc06-88764e9d54c2" 
path="/var/lib/kubelet/pods/92578329-85d1-4295-bc06-88764e9d54c2/volumes" Dec 11 13:09:34 crc kubenswrapper[4898]: I1211 13:09:34.787698 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c9df9f-29e5-45ed-bd07-642ae50d0ed9" path="/var/lib/kubelet/pods/a3c9df9f-29e5-45ed-bd07-642ae50d0ed9/volumes" Dec 11 13:09:34 crc kubenswrapper[4898]: I1211 13:09:34.797376 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-56mlq\" (UID: \"eca70e56-1feb-4a48-9c32-db8e075ebfff\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" Dec 11 13:09:34 crc kubenswrapper[4898]: E1211 13:09:34.797628 4898 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 11 13:09:34 crc kubenswrapper[4898]: E1211 13:09:34.797724 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates podName:eca70e56-1feb-4a48-9c32-db8e075ebfff nodeName:}" failed. No retries permitted until 2025-12-11 13:09:42.797698887 +0000 UTC m=+340.370025364 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-56mlq" (UID: "eca70e56-1feb-4a48-9c32-db8e075ebfff") : secret "prometheus-operator-admission-webhook-tls" not found Dec 11 13:09:42 crc kubenswrapper[4898]: I1211 13:09:42.806363 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-56mlq\" (UID: \"eca70e56-1feb-4a48-9c32-db8e075ebfff\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" Dec 11 13:09:42 crc kubenswrapper[4898]: E1211 13:09:42.806611 4898 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 11 13:09:42 crc kubenswrapper[4898]: E1211 13:09:42.807076 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates podName:eca70e56-1feb-4a48-9c32-db8e075ebfff nodeName:}" failed. No retries permitted until 2025-12-11 13:09:58.807037083 +0000 UTC m=+356.379363520 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-56mlq" (UID: "eca70e56-1feb-4a48-9c32-db8e075ebfff") : secret "prometheus-operator-admission-webhook-tls" not found Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.081861 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nkc2q"] Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.085657 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nkc2q" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.097172 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nkc2q"] Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.097941 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.238369 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b434beff-f5d1-4b10-8715-10cdfe445919-utilities\") pod \"certified-operators-nkc2q\" (UID: \"b434beff-f5d1-4b10-8715-10cdfe445919\") " pod="openshift-marketplace/certified-operators-nkc2q" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.238487 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b434beff-f5d1-4b10-8715-10cdfe445919-catalog-content\") pod \"certified-operators-nkc2q\" (UID: \"b434beff-f5d1-4b10-8715-10cdfe445919\") " pod="openshift-marketplace/certified-operators-nkc2q" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.238520 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj7nm\" (UniqueName: \"kubernetes.io/projected/b434beff-f5d1-4b10-8715-10cdfe445919-kube-api-access-wj7nm\") pod \"certified-operators-nkc2q\" (UID: \"b434beff-f5d1-4b10-8715-10cdfe445919\") " pod="openshift-marketplace/certified-operators-nkc2q" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.275854 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jncnt"] Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.276796 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jncnt" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.279603 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.297191 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jncnt"] Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.340050 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b434beff-f5d1-4b10-8715-10cdfe445919-catalog-content\") pod \"certified-operators-nkc2q\" (UID: \"b434beff-f5d1-4b10-8715-10cdfe445919\") " pod="openshift-marketplace/certified-operators-nkc2q" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.340143 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj7nm\" (UniqueName: \"kubernetes.io/projected/b434beff-f5d1-4b10-8715-10cdfe445919-kube-api-access-wj7nm\") pod \"certified-operators-nkc2q\" (UID: \"b434beff-f5d1-4b10-8715-10cdfe445919\") " pod="openshift-marketplace/certified-operators-nkc2q" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.340209 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b434beff-f5d1-4b10-8715-10cdfe445919-utilities\") pod \"certified-operators-nkc2q\" (UID: \"b434beff-f5d1-4b10-8715-10cdfe445919\") " pod="openshift-marketplace/certified-operators-nkc2q" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.340289 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e837aefb-2b43-47e8-87b0-232560ff1b37-catalog-content\") pod \"community-operators-jncnt\" (UID: \"e837aefb-2b43-47e8-87b0-232560ff1b37\") " 
pod="openshift-marketplace/community-operators-jncnt" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.340326 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e837aefb-2b43-47e8-87b0-232560ff1b37-utilities\") pod \"community-operators-jncnt\" (UID: \"e837aefb-2b43-47e8-87b0-232560ff1b37\") " pod="openshift-marketplace/community-operators-jncnt" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.340375 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfh78\" (UniqueName: \"kubernetes.io/projected/e837aefb-2b43-47e8-87b0-232560ff1b37-kube-api-access-lfh78\") pod \"community-operators-jncnt\" (UID: \"e837aefb-2b43-47e8-87b0-232560ff1b37\") " pod="openshift-marketplace/community-operators-jncnt" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.341103 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b434beff-f5d1-4b10-8715-10cdfe445919-catalog-content\") pod \"certified-operators-nkc2q\" (UID: \"b434beff-f5d1-4b10-8715-10cdfe445919\") " pod="openshift-marketplace/certified-operators-nkc2q" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.341899 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b434beff-f5d1-4b10-8715-10cdfe445919-utilities\") pod \"certified-operators-nkc2q\" (UID: \"b434beff-f5d1-4b10-8715-10cdfe445919\") " pod="openshift-marketplace/certified-operators-nkc2q" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.360196 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj7nm\" (UniqueName: \"kubernetes.io/projected/b434beff-f5d1-4b10-8715-10cdfe445919-kube-api-access-wj7nm\") pod \"certified-operators-nkc2q\" (UID: \"b434beff-f5d1-4b10-8715-10cdfe445919\") " 
pod="openshift-marketplace/certified-operators-nkc2q" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.409173 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nkc2q" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.442172 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e837aefb-2b43-47e8-87b0-232560ff1b37-catalog-content\") pod \"community-operators-jncnt\" (UID: \"e837aefb-2b43-47e8-87b0-232560ff1b37\") " pod="openshift-marketplace/community-operators-jncnt" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.442501 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e837aefb-2b43-47e8-87b0-232560ff1b37-utilities\") pod \"community-operators-jncnt\" (UID: \"e837aefb-2b43-47e8-87b0-232560ff1b37\") " pod="openshift-marketplace/community-operators-jncnt" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.442575 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfh78\" (UniqueName: \"kubernetes.io/projected/e837aefb-2b43-47e8-87b0-232560ff1b37-kube-api-access-lfh78\") pod \"community-operators-jncnt\" (UID: \"e837aefb-2b43-47e8-87b0-232560ff1b37\") " pod="openshift-marketplace/community-operators-jncnt" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.443425 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e837aefb-2b43-47e8-87b0-232560ff1b37-catalog-content\") pod \"community-operators-jncnt\" (UID: \"e837aefb-2b43-47e8-87b0-232560ff1b37\") " pod="openshift-marketplace/community-operators-jncnt" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.443759 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e837aefb-2b43-47e8-87b0-232560ff1b37-utilities\") pod \"community-operators-jncnt\" (UID: \"e837aefb-2b43-47e8-87b0-232560ff1b37\") " pod="openshift-marketplace/community-operators-jncnt" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.464689 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfh78\" (UniqueName: \"kubernetes.io/projected/e837aefb-2b43-47e8-87b0-232560ff1b37-kube-api-access-lfh78\") pod \"community-operators-jncnt\" (UID: \"e837aefb-2b43-47e8-87b0-232560ff1b37\") " pod="openshift-marketplace/community-operators-jncnt" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.600192 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jncnt" Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.810620 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nkc2q"] Dec 11 13:09:53 crc kubenswrapper[4898]: W1211 13:09:53.813142 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb434beff_f5d1_4b10_8715_10cdfe445919.slice/crio-fb0bc6e0db9757b276574b4114f238ea854e91eb4ff93d982340effa5af9ee80 WatchSource:0}: Error finding container fb0bc6e0db9757b276574b4114f238ea854e91eb4ff93d982340effa5af9ee80: Status 404 returned error can't find the container with id fb0bc6e0db9757b276574b4114f238ea854e91eb4ff93d982340effa5af9ee80 Dec 11 13:09:53 crc kubenswrapper[4898]: I1211 13:09:53.990503 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jncnt"] Dec 11 13:09:53 crc kubenswrapper[4898]: W1211 13:09:53.992820 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode837aefb_2b43_47e8_87b0_232560ff1b37.slice/crio-d6850ed2969d7497dcac7fc4782c83401cd9598144bc72f713a94fbc71ccf4ee WatchSource:0}: Error finding container d6850ed2969d7497dcac7fc4782c83401cd9598144bc72f713a94fbc71ccf4ee: Status 404 returned error can't find the container with id d6850ed2969d7497dcac7fc4782c83401cd9598144bc72f713a94fbc71ccf4ee Dec 11 13:09:54 crc kubenswrapper[4898]: I1211 13:09:54.338305 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkc2q" event={"ID":"b434beff-f5d1-4b10-8715-10cdfe445919","Type":"ContainerStarted","Data":"fb0bc6e0db9757b276574b4114f238ea854e91eb4ff93d982340effa5af9ee80"} Dec 11 13:09:54 crc kubenswrapper[4898]: I1211 13:09:54.339485 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jncnt" event={"ID":"e837aefb-2b43-47e8-87b0-232560ff1b37","Type":"ContainerStarted","Data":"d6850ed2969d7497dcac7fc4782c83401cd9598144bc72f713a94fbc71ccf4ee"} Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.389062 4898 generic.go:334] "Generic (PLEG): container finished" podID="b434beff-f5d1-4b10-8715-10cdfe445919" containerID="4c3ce650cd10601935ed8a528cec2220aa22009d849ebea3a3749e35dfff3744" exitCode=0 Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.390527 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkc2q" event={"ID":"b434beff-f5d1-4b10-8715-10cdfe445919","Type":"ContainerDied","Data":"4c3ce650cd10601935ed8a528cec2220aa22009d849ebea3a3749e35dfff3744"} Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.398797 4898 generic.go:334] "Generic (PLEG): container finished" podID="e837aefb-2b43-47e8-87b0-232560ff1b37" containerID="1ff677b49eeceb6c58c9193b6fd1164744c41c9649515b46349516a5de4dedd3" exitCode=0 Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.398938 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-jncnt" event={"ID":"e837aefb-2b43-47e8-87b0-232560ff1b37","Type":"ContainerDied","Data":"1ff677b49eeceb6c58c9193b6fd1164744c41c9649515b46349516a5de4dedd3"} Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.468438 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jqfm2"] Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.469872 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jqfm2" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.475172 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.482056 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqfm2"] Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.669758 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd2qg\" (UniqueName: \"kubernetes.io/projected/f1d2e209-8101-4341-b232-ed52d1d9f629-kube-api-access-zd2qg\") pod \"redhat-marketplace-jqfm2\" (UID: \"f1d2e209-8101-4341-b232-ed52d1d9f629\") " pod="openshift-marketplace/redhat-marketplace-jqfm2" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.669799 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1d2e209-8101-4341-b232-ed52d1d9f629-catalog-content\") pod \"redhat-marketplace-jqfm2\" (UID: \"f1d2e209-8101-4341-b232-ed52d1d9f629\") " pod="openshift-marketplace/redhat-marketplace-jqfm2" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.669858 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f1d2e209-8101-4341-b232-ed52d1d9f629-utilities\") pod \"redhat-marketplace-jqfm2\" (UID: \"f1d2e209-8101-4341-b232-ed52d1d9f629\") " pod="openshift-marketplace/redhat-marketplace-jqfm2" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.673693 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kxmhp"] Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.674818 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxmhp" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.677093 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.687476 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxmhp"] Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.770656 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1d2e209-8101-4341-b232-ed52d1d9f629-catalog-content\") pod \"redhat-marketplace-jqfm2\" (UID: \"f1d2e209-8101-4341-b232-ed52d1d9f629\") " pod="openshift-marketplace/redhat-marketplace-jqfm2" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.770738 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1d2e209-8101-4341-b232-ed52d1d9f629-utilities\") pod \"redhat-marketplace-jqfm2\" (UID: \"f1d2e209-8101-4341-b232-ed52d1d9f629\") " pod="openshift-marketplace/redhat-marketplace-jqfm2" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.770788 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd2qg\" (UniqueName: \"kubernetes.io/projected/f1d2e209-8101-4341-b232-ed52d1d9f629-kube-api-access-zd2qg\") pod 
\"redhat-marketplace-jqfm2\" (UID: \"f1d2e209-8101-4341-b232-ed52d1d9f629\") " pod="openshift-marketplace/redhat-marketplace-jqfm2" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.771301 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1d2e209-8101-4341-b232-ed52d1d9f629-catalog-content\") pod \"redhat-marketplace-jqfm2\" (UID: \"f1d2e209-8101-4341-b232-ed52d1d9f629\") " pod="openshift-marketplace/redhat-marketplace-jqfm2" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.771383 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1d2e209-8101-4341-b232-ed52d1d9f629-utilities\") pod \"redhat-marketplace-jqfm2\" (UID: \"f1d2e209-8101-4341-b232-ed52d1d9f629\") " pod="openshift-marketplace/redhat-marketplace-jqfm2" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.803513 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd2qg\" (UniqueName: \"kubernetes.io/projected/f1d2e209-8101-4341-b232-ed52d1d9f629-kube-api-access-zd2qg\") pod \"redhat-marketplace-jqfm2\" (UID: \"f1d2e209-8101-4341-b232-ed52d1d9f629\") " pod="openshift-marketplace/redhat-marketplace-jqfm2" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.872043 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a-catalog-content\") pod \"redhat-operators-kxmhp\" (UID: \"c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a\") " pod="openshift-marketplace/redhat-operators-kxmhp" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.872136 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qkt6\" (UniqueName: \"kubernetes.io/projected/c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a-kube-api-access-6qkt6\") pod 
\"redhat-operators-kxmhp\" (UID: \"c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a\") " pod="openshift-marketplace/redhat-operators-kxmhp" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.872494 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a-utilities\") pod \"redhat-operators-kxmhp\" (UID: \"c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a\") " pod="openshift-marketplace/redhat-operators-kxmhp" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.974268 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a-catalog-content\") pod \"redhat-operators-kxmhp\" (UID: \"c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a\") " pod="openshift-marketplace/redhat-operators-kxmhp" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.974437 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qkt6\" (UniqueName: \"kubernetes.io/projected/c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a-kube-api-access-6qkt6\") pod \"redhat-operators-kxmhp\" (UID: \"c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a\") " pod="openshift-marketplace/redhat-operators-kxmhp" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.974589 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a-utilities\") pod \"redhat-operators-kxmhp\" (UID: \"c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a\") " pod="openshift-marketplace/redhat-operators-kxmhp" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.974998 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a-catalog-content\") pod \"redhat-operators-kxmhp\" (UID: 
\"c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a\") " pod="openshift-marketplace/redhat-operators-kxmhp" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.975121 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a-utilities\") pod \"redhat-operators-kxmhp\" (UID: \"c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a\") " pod="openshift-marketplace/redhat-operators-kxmhp" Dec 11 13:09:55 crc kubenswrapper[4898]: I1211 13:09:55.997501 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qkt6\" (UniqueName: \"kubernetes.io/projected/c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a-kube-api-access-6qkt6\") pod \"redhat-operators-kxmhp\" (UID: \"c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a\") " pod="openshift-marketplace/redhat-operators-kxmhp" Dec 11 13:09:56 crc kubenswrapper[4898]: I1211 13:09:56.007144 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxmhp" Dec 11 13:09:56 crc kubenswrapper[4898]: I1211 13:09:56.087505 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jqfm2" Dec 11 13:09:56 crc kubenswrapper[4898]: I1211 13:09:56.404809 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkc2q" event={"ID":"b434beff-f5d1-4b10-8715-10cdfe445919","Type":"ContainerStarted","Data":"a4635cb646835a4796b49fc993dcdb0e1f5f73a4f0c0499b1186b31bd2891382"} Dec 11 13:09:56 crc kubenswrapper[4898]: I1211 13:09:56.406514 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jncnt" event={"ID":"e837aefb-2b43-47e8-87b0-232560ff1b37","Type":"ContainerStarted","Data":"3dc142382f93bb6d287e87490aa8d54d6e6dae1e7720cb5ce7e779702c3af0bc"} Dec 11 13:09:56 crc kubenswrapper[4898]: I1211 13:09:56.493140 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxmhp"] Dec 11 13:09:56 crc kubenswrapper[4898]: I1211 13:09:56.587646 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jqfm2"] Dec 11 13:09:56 crc kubenswrapper[4898]: W1211 13:09:56.595355 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3a84ac6_e73e_4e1f_bd6e_cfb6c607dd4a.slice/crio-243a3c28a422847152c3d33da45116e05f372dcd17d7198e380428b8872418ab WatchSource:0}: Error finding container 243a3c28a422847152c3d33da45116e05f372dcd17d7198e380428b8872418ab: Status 404 returned error can't find the container with id 243a3c28a422847152c3d33da45116e05f372dcd17d7198e380428b8872418ab Dec 11 13:09:56 crc kubenswrapper[4898]: W1211 13:09:56.596295 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d2e209_8101_4341_b232_ed52d1d9f629.slice/crio-556b47f9507fbce55eed7112e1b666318a2618bce95b1cf8155cac72b1c10f36 WatchSource:0}: Error finding container 
556b47f9507fbce55eed7112e1b666318a2618bce95b1cf8155cac72b1c10f36: Status 404 returned error can't find the container with id 556b47f9507fbce55eed7112e1b666318a2618bce95b1cf8155cac72b1c10f36 Dec 11 13:09:57 crc kubenswrapper[4898]: I1211 13:09:57.413315 4898 generic.go:334] "Generic (PLEG): container finished" podID="b434beff-f5d1-4b10-8715-10cdfe445919" containerID="a4635cb646835a4796b49fc993dcdb0e1f5f73a4f0c0499b1186b31bd2891382" exitCode=0 Dec 11 13:09:57 crc kubenswrapper[4898]: I1211 13:09:57.413575 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkc2q" event={"ID":"b434beff-f5d1-4b10-8715-10cdfe445919","Type":"ContainerDied","Data":"a4635cb646835a4796b49fc993dcdb0e1f5f73a4f0c0499b1186b31bd2891382"} Dec 11 13:09:57 crc kubenswrapper[4898]: I1211 13:09:57.416399 4898 generic.go:334] "Generic (PLEG): container finished" podID="e837aefb-2b43-47e8-87b0-232560ff1b37" containerID="3dc142382f93bb6d287e87490aa8d54d6e6dae1e7720cb5ce7e779702c3af0bc" exitCode=0 Dec 11 13:09:57 crc kubenswrapper[4898]: I1211 13:09:57.416446 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jncnt" event={"ID":"e837aefb-2b43-47e8-87b0-232560ff1b37","Type":"ContainerDied","Data":"3dc142382f93bb6d287e87490aa8d54d6e6dae1e7720cb5ce7e779702c3af0bc"} Dec 11 13:09:57 crc kubenswrapper[4898]: I1211 13:09:57.420180 4898 generic.go:334] "Generic (PLEG): container finished" podID="f1d2e209-8101-4341-b232-ed52d1d9f629" containerID="3f7641082200fbe2d662c02a6f4e975122f91035a4b738d6a48516245ba1da10" exitCode=0 Dec 11 13:09:57 crc kubenswrapper[4898]: I1211 13:09:57.420564 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqfm2" event={"ID":"f1d2e209-8101-4341-b232-ed52d1d9f629","Type":"ContainerDied","Data":"3f7641082200fbe2d662c02a6f4e975122f91035a4b738d6a48516245ba1da10"} Dec 11 13:09:57 crc kubenswrapper[4898]: I1211 13:09:57.420620 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqfm2" event={"ID":"f1d2e209-8101-4341-b232-ed52d1d9f629","Type":"ContainerStarted","Data":"556b47f9507fbce55eed7112e1b666318a2618bce95b1cf8155cac72b1c10f36"} Dec 11 13:09:57 crc kubenswrapper[4898]: I1211 13:09:57.424252 4898 generic.go:334] "Generic (PLEG): container finished" podID="c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a" containerID="01ae789b1ab5c83e1b66406c82d0864abba4a63396886d1496ba37e5044e252e" exitCode=0 Dec 11 13:09:57 crc kubenswrapper[4898]: I1211 13:09:57.424304 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxmhp" event={"ID":"c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a","Type":"ContainerDied","Data":"01ae789b1ab5c83e1b66406c82d0864abba4a63396886d1496ba37e5044e252e"} Dec 11 13:09:57 crc kubenswrapper[4898]: I1211 13:09:57.424333 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxmhp" event={"ID":"c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a","Type":"ContainerStarted","Data":"243a3c28a422847152c3d33da45116e05f372dcd17d7198e380428b8872418ab"} Dec 11 13:09:58 crc kubenswrapper[4898]: I1211 13:09:58.437379 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jncnt" event={"ID":"e837aefb-2b43-47e8-87b0-232560ff1b37","Type":"ContainerStarted","Data":"07e965eb6b103c4e2659fb2c994cc362dc059642fab98e3d70f1f7ab427a964a"} Dec 11 13:09:58 crc kubenswrapper[4898]: I1211 13:09:58.443083 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxmhp" event={"ID":"c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a","Type":"ContainerStarted","Data":"0ea3ac00163e039580092c274dc04ebee598a9985d91227d3f9c27f3d3df2cdb"} Dec 11 13:09:58 crc kubenswrapper[4898]: I1211 13:09:58.445505 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkc2q" 
event={"ID":"b434beff-f5d1-4b10-8715-10cdfe445919","Type":"ContainerStarted","Data":"b30ae26ad276388bdca5effb256fe4997fedb39637a0d2e16ab82cb5a4f01aa4"} Dec 11 13:09:58 crc kubenswrapper[4898]: I1211 13:09:58.460398 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jncnt" podStartSLOduration=2.738819445 podStartE2EDuration="5.460382488s" podCreationTimestamp="2025-12-11 13:09:53 +0000 UTC" firstStartedPulling="2025-12-11 13:09:55.401147313 +0000 UTC m=+352.973473750" lastFinishedPulling="2025-12-11 13:09:58.122710356 +0000 UTC m=+355.695036793" observedRunningTime="2025-12-11 13:09:58.457266123 +0000 UTC m=+356.029592560" watchObservedRunningTime="2025-12-11 13:09:58.460382488 +0000 UTC m=+356.032708945" Dec 11 13:09:58 crc kubenswrapper[4898]: I1211 13:09:58.501879 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nkc2q" podStartSLOduration=2.806112939 podStartE2EDuration="5.501857818s" podCreationTimestamp="2025-12-11 13:09:53 +0000 UTC" firstStartedPulling="2025-12-11 13:09:55.391183892 +0000 UTC m=+352.963510339" lastFinishedPulling="2025-12-11 13:09:58.086928781 +0000 UTC m=+355.659255218" observedRunningTime="2025-12-11 13:09:58.499018901 +0000 UTC m=+356.071345338" watchObservedRunningTime="2025-12-11 13:09:58.501857818 +0000 UTC m=+356.074184265" Dec 11 13:09:58 crc kubenswrapper[4898]: I1211 13:09:58.907066 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-56mlq\" (UID: \"eca70e56-1feb-4a48-9c32-db8e075ebfff\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" Dec 11 13:09:58 crc kubenswrapper[4898]: E1211 13:09:58.907207 4898 secret.go:188] Couldn't get secret 
openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Dec 11 13:09:58 crc kubenswrapper[4898]: E1211 13:09:58.907730 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates podName:eca70e56-1feb-4a48-9c32-db8e075ebfff nodeName:}" failed. No retries permitted until 2025-12-11 13:10:30.907706298 +0000 UTC m=+388.480032735 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-56mlq" (UID: "eca70e56-1feb-4a48-9c32-db8e075ebfff") : secret "prometheus-operator-admission-webhook-tls" not found Dec 11 13:09:59 crc kubenswrapper[4898]: I1211 13:09:59.453293 4898 generic.go:334] "Generic (PLEG): container finished" podID="f1d2e209-8101-4341-b232-ed52d1d9f629" containerID="90dc7921149f6dbfe17a6766e1b2c853c433056ecd76fb3062b09e242430e7e7" exitCode=0 Dec 11 13:09:59 crc kubenswrapper[4898]: I1211 13:09:59.453374 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqfm2" event={"ID":"f1d2e209-8101-4341-b232-ed52d1d9f629","Type":"ContainerDied","Data":"90dc7921149f6dbfe17a6766e1b2c853c433056ecd76fb3062b09e242430e7e7"} Dec 11 13:09:59 crc kubenswrapper[4898]: I1211 13:09:59.455213 4898 generic.go:334] "Generic (PLEG): container finished" podID="c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a" containerID="0ea3ac00163e039580092c274dc04ebee598a9985d91227d3f9c27f3d3df2cdb" exitCode=0 Dec 11 13:09:59 crc kubenswrapper[4898]: I1211 13:09:59.455270 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxmhp" event={"ID":"c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a","Type":"ContainerDied","Data":"0ea3ac00163e039580092c274dc04ebee598a9985d91227d3f9c27f3d3df2cdb"} Dec 11 13:10:00 crc 
kubenswrapper[4898]: I1211 13:10:00.462322 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jqfm2" event={"ID":"f1d2e209-8101-4341-b232-ed52d1d9f629","Type":"ContainerStarted","Data":"938c9f34f93e292b19e16d064e1c90e1bc18aa4bb09ea76a6d00e34040ecd564"} Dec 11 13:10:00 crc kubenswrapper[4898]: I1211 13:10:00.464118 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxmhp" event={"ID":"c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a","Type":"ContainerStarted","Data":"90a99bfd70f69a4d76721cfbcf8744865c195809a37a8fdbe17c1dde03a4a083"} Dec 11 13:10:00 crc kubenswrapper[4898]: I1211 13:10:00.482693 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jqfm2" podStartSLOduration=2.98301955 podStartE2EDuration="5.482674376s" podCreationTimestamp="2025-12-11 13:09:55 +0000 UTC" firstStartedPulling="2025-12-11 13:09:57.421984091 +0000 UTC m=+354.994310538" lastFinishedPulling="2025-12-11 13:09:59.921638927 +0000 UTC m=+357.493965364" observedRunningTime="2025-12-11 13:10:00.481308908 +0000 UTC m=+358.053635365" watchObservedRunningTime="2025-12-11 13:10:00.482674376 +0000 UTC m=+358.055000813" Dec 11 13:10:00 crc kubenswrapper[4898]: I1211 13:10:00.500670 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kxmhp" podStartSLOduration=2.97938045 podStartE2EDuration="5.500654566s" podCreationTimestamp="2025-12-11 13:09:55 +0000 UTC" firstStartedPulling="2025-12-11 13:09:57.42599629 +0000 UTC m=+354.998322777" lastFinishedPulling="2025-12-11 13:09:59.947270456 +0000 UTC m=+357.519596893" observedRunningTime="2025-12-11 13:10:00.499318409 +0000 UTC m=+358.071644856" watchObservedRunningTime="2025-12-11 13:10:00.500654566 +0000 UTC m=+358.072981003" Dec 11 13:10:03 crc kubenswrapper[4898]: I1211 13:10:03.410114 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-nkc2q" Dec 11 13:10:03 crc kubenswrapper[4898]: I1211 13:10:03.410494 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nkc2q" Dec 11 13:10:03 crc kubenswrapper[4898]: I1211 13:10:03.461218 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nkc2q" Dec 11 13:10:03 crc kubenswrapper[4898]: I1211 13:10:03.545162 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nkc2q" Dec 11 13:10:03 crc kubenswrapper[4898]: I1211 13:10:03.600784 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jncnt" Dec 11 13:10:03 crc kubenswrapper[4898]: I1211 13:10:03.601077 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jncnt" Dec 11 13:10:03 crc kubenswrapper[4898]: I1211 13:10:03.639648 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jncnt" Dec 11 13:10:04 crc kubenswrapper[4898]: I1211 13:10:04.550777 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jncnt" Dec 11 13:10:04 crc kubenswrapper[4898]: I1211 13:10:04.995405 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:10:04 crc kubenswrapper[4898]: I1211 13:10:04.995747 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:10:06 crc kubenswrapper[4898]: I1211 13:10:06.007678 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kxmhp" Dec 11 13:10:06 crc kubenswrapper[4898]: I1211 13:10:06.007749 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kxmhp" Dec 11 13:10:06 crc kubenswrapper[4898]: I1211 13:10:06.088030 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jqfm2" Dec 11 13:10:06 crc kubenswrapper[4898]: I1211 13:10:06.088086 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jqfm2" Dec 11 13:10:06 crc kubenswrapper[4898]: I1211 13:10:06.137887 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jqfm2" Dec 11 13:10:06 crc kubenswrapper[4898]: I1211 13:10:06.551193 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jqfm2" Dec 11 13:10:07 crc kubenswrapper[4898]: I1211 13:10:07.062806 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kxmhp" podUID="c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a" containerName="registry-server" probeResult="failure" output=< Dec 11 13:10:07 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 13:10:07 crc kubenswrapper[4898]: > Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.705405 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jhl8f"] Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.706515 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.717813 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jhl8f"] Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.779096 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1be863d-5347-436b-a9d0-0525eb27e141-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.780332 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1be863d-5347-436b-a9d0-0525eb27e141-bound-sa-token\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.780401 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1be863d-5347-436b-a9d0-0525eb27e141-registry-certificates\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.780423 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1be863d-5347-436b-a9d0-0525eb27e141-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.780631 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1be863d-5347-436b-a9d0-0525eb27e141-trusted-ca\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.780699 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn645\" (UniqueName: \"kubernetes.io/projected/c1be863d-5347-436b-a9d0-0525eb27e141-kube-api-access-pn645\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.780775 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.780822 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1be863d-5347-436b-a9d0-0525eb27e141-registry-tls\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.812761 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.882036 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1be863d-5347-436b-a9d0-0525eb27e141-trusted-ca\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.882287 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn645\" (UniqueName: \"kubernetes.io/projected/c1be863d-5347-436b-a9d0-0525eb27e141-kube-api-access-pn645\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.882384 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1be863d-5347-436b-a9d0-0525eb27e141-registry-tls\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.882488 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1be863d-5347-436b-a9d0-0525eb27e141-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.882610 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1be863d-5347-436b-a9d0-0525eb27e141-bound-sa-token\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.882711 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1be863d-5347-436b-a9d0-0525eb27e141-registry-certificates\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.882790 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1be863d-5347-436b-a9d0-0525eb27e141-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.883201 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1be863d-5347-436b-a9d0-0525eb27e141-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.883720 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1be863d-5347-436b-a9d0-0525eb27e141-trusted-ca\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc 
kubenswrapper[4898]: I1211 13:10:10.884703 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1be863d-5347-436b-a9d0-0525eb27e141-registry-certificates\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.888151 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1be863d-5347-436b-a9d0-0525eb27e141-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.889939 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1be863d-5347-436b-a9d0-0525eb27e141-registry-tls\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.899990 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn645\" (UniqueName: \"kubernetes.io/projected/c1be863d-5347-436b-a9d0-0525eb27e141-kube-api-access-pn645\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:10 crc kubenswrapper[4898]: I1211 13:10:10.902389 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1be863d-5347-436b-a9d0-0525eb27e141-bound-sa-token\") pod \"image-registry-66df7c8f76-jhl8f\" (UID: \"c1be863d-5347-436b-a9d0-0525eb27e141\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:11 crc kubenswrapper[4898]: I1211 13:10:11.023259 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:11 crc kubenswrapper[4898]: I1211 13:10:11.485574 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jhl8f"] Dec 11 13:10:11 crc kubenswrapper[4898]: W1211 13:10:11.496649 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1be863d_5347_436b_a9d0_0525eb27e141.slice/crio-8825b94ded4d50cd9a813c175897bc9b6d92260026fb89e77029eab935daf89e WatchSource:0}: Error finding container 8825b94ded4d50cd9a813c175897bc9b6d92260026fb89e77029eab935daf89e: Status 404 returned error can't find the container with id 8825b94ded4d50cd9a813c175897bc9b6d92260026fb89e77029eab935daf89e Dec 11 13:10:11 crc kubenswrapper[4898]: I1211 13:10:11.525910 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" event={"ID":"c1be863d-5347-436b-a9d0-0525eb27e141","Type":"ContainerStarted","Data":"8825b94ded4d50cd9a813c175897bc9b6d92260026fb89e77029eab935daf89e"} Dec 11 13:10:12 crc kubenswrapper[4898]: I1211 13:10:12.170499 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-dbf56b754-mxg92"] Dec 11 13:10:12 crc kubenswrapper[4898]: I1211 13:10:12.171057 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" podUID="1c9a929f-745b-48d6-9799-f762ee7b14fe" containerName="controller-manager" containerID="cri-o://ef4276fe2fd24f0cf5ade8cb78337c92a6e94ed9be75fb32e3ecbf6510057c7e" gracePeriod=30 Dec 11 13:10:12 crc kubenswrapper[4898]: I1211 13:10:12.931049 4898 patch_prober.go:28] interesting 
pod/controller-manager-dbf56b754-mxg92 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Dec 11 13:10:12 crc kubenswrapper[4898]: I1211 13:10:12.931162 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" podUID="1c9a929f-745b-48d6-9799-f762ee7b14fe" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.548620 4898 generic.go:334] "Generic (PLEG): container finished" podID="1c9a929f-745b-48d6-9799-f762ee7b14fe" containerID="ef4276fe2fd24f0cf5ade8cb78337c92a6e94ed9be75fb32e3ecbf6510057c7e" exitCode=0 Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.548792 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" event={"ID":"1c9a929f-745b-48d6-9799-f762ee7b14fe","Type":"ContainerDied","Data":"ef4276fe2fd24f0cf5ade8cb78337c92a6e94ed9be75fb32e3ecbf6510057c7e"} Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.551554 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" event={"ID":"c1be863d-5347-436b-a9d0-0525eb27e141","Type":"ContainerStarted","Data":"9379a44efd628603606c01836125e3e853e330b20f827eb5444bfaff33d322ef"} Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.551865 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.703575 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.722589 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" podStartSLOduration=3.7225682559999997 podStartE2EDuration="3.722568256s" podCreationTimestamp="2025-12-11 13:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:10:13.577863382 +0000 UTC m=+371.150189819" watchObservedRunningTime="2025-12-11 13:10:13.722568256 +0000 UTC m=+371.294894693" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.730892 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d6f47df77-zcqxc"] Dec 11 13:10:13 crc kubenswrapper[4898]: E1211 13:10:13.731092 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c9a929f-745b-48d6-9799-f762ee7b14fe" containerName="controller-manager" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.731103 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c9a929f-745b-48d6-9799-f762ee7b14fe" containerName="controller-manager" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.731207 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c9a929f-745b-48d6-9799-f762ee7b14fe" containerName="controller-manager" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.731584 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.739862 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d6f47df77-zcqxc"] Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.821020 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-client-ca\") pod \"1c9a929f-745b-48d6-9799-f762ee7b14fe\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.821075 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c9a929f-745b-48d6-9799-f762ee7b14fe-serving-cert\") pod \"1c9a929f-745b-48d6-9799-f762ee7b14fe\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.821104 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pkjr\" (UniqueName: \"kubernetes.io/projected/1c9a929f-745b-48d6-9799-f762ee7b14fe-kube-api-access-9pkjr\") pod \"1c9a929f-745b-48d6-9799-f762ee7b14fe\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.821175 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-proxy-ca-bundles\") pod \"1c9a929f-745b-48d6-9799-f762ee7b14fe\" (UID: \"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.821213 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-config\") pod \"1c9a929f-745b-48d6-9799-f762ee7b14fe\" (UID: 
\"1c9a929f-745b-48d6-9799-f762ee7b14fe\") " Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.822177 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-client-ca" (OuterVolumeSpecName: "client-ca") pod "1c9a929f-745b-48d6-9799-f762ee7b14fe" (UID: "1c9a929f-745b-48d6-9799-f762ee7b14fe"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.822222 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-config" (OuterVolumeSpecName: "config") pod "1c9a929f-745b-48d6-9799-f762ee7b14fe" (UID: "1c9a929f-745b-48d6-9799-f762ee7b14fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.822799 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1c9a929f-745b-48d6-9799-f762ee7b14fe" (UID: "1c9a929f-745b-48d6-9799-f762ee7b14fe"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.828024 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c9a929f-745b-48d6-9799-f762ee7b14fe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1c9a929f-745b-48d6-9799-f762ee7b14fe" (UID: "1c9a929f-745b-48d6-9799-f762ee7b14fe"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.828418 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c9a929f-745b-48d6-9799-f762ee7b14fe-kube-api-access-9pkjr" (OuterVolumeSpecName: "kube-api-access-9pkjr") pod "1c9a929f-745b-48d6-9799-f762ee7b14fe" (UID: "1c9a929f-745b-48d6-9799-f762ee7b14fe"). InnerVolumeSpecName "kube-api-access-9pkjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.922686 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd6bf11-4912-4c55-b89d-5866167a0283-serving-cert\") pod \"controller-manager-6d6f47df77-zcqxc\" (UID: \"0fd6bf11-4912-4c55-b89d-5866167a0283\") " pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.922765 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz6xv\" (UniqueName: \"kubernetes.io/projected/0fd6bf11-4912-4c55-b89d-5866167a0283-kube-api-access-pz6xv\") pod \"controller-manager-6d6f47df77-zcqxc\" (UID: \"0fd6bf11-4912-4c55-b89d-5866167a0283\") " pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.922791 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fd6bf11-4912-4c55-b89d-5866167a0283-config\") pod \"controller-manager-6d6f47df77-zcqxc\" (UID: \"0fd6bf11-4912-4c55-b89d-5866167a0283\") " pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.922807 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0fd6bf11-4912-4c55-b89d-5866167a0283-proxy-ca-bundles\") pod \"controller-manager-6d6f47df77-zcqxc\" (UID: \"0fd6bf11-4912-4c55-b89d-5866167a0283\") " pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.922840 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0fd6bf11-4912-4c55-b89d-5866167a0283-client-ca\") pod \"controller-manager-6d6f47df77-zcqxc\" (UID: \"0fd6bf11-4912-4c55-b89d-5866167a0283\") " pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.922895 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.922906 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.922916 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c9a929f-745b-48d6-9799-f762ee7b14fe-client-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.922927 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c9a929f-745b-48d6-9799-f762ee7b14fe-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:13 crc kubenswrapper[4898]: I1211 13:10:13.922939 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pkjr\" (UniqueName: 
\"kubernetes.io/projected/1c9a929f-745b-48d6-9799-f762ee7b14fe-kube-api-access-9pkjr\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.023705 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0fd6bf11-4912-4c55-b89d-5866167a0283-client-ca\") pod \"controller-manager-6d6f47df77-zcqxc\" (UID: \"0fd6bf11-4912-4c55-b89d-5866167a0283\") " pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.023782 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd6bf11-4912-4c55-b89d-5866167a0283-serving-cert\") pod \"controller-manager-6d6f47df77-zcqxc\" (UID: \"0fd6bf11-4912-4c55-b89d-5866167a0283\") " pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.023841 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz6xv\" (UniqueName: \"kubernetes.io/projected/0fd6bf11-4912-4c55-b89d-5866167a0283-kube-api-access-pz6xv\") pod \"controller-manager-6d6f47df77-zcqxc\" (UID: \"0fd6bf11-4912-4c55-b89d-5866167a0283\") " pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.023859 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fd6bf11-4912-4c55-b89d-5866167a0283-config\") pod \"controller-manager-6d6f47df77-zcqxc\" (UID: \"0fd6bf11-4912-4c55-b89d-5866167a0283\") " pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.023873 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/0fd6bf11-4912-4c55-b89d-5866167a0283-proxy-ca-bundles\") pod \"controller-manager-6d6f47df77-zcqxc\" (UID: \"0fd6bf11-4912-4c55-b89d-5866167a0283\") " pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.025582 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0fd6bf11-4912-4c55-b89d-5866167a0283-proxy-ca-bundles\") pod \"controller-manager-6d6f47df77-zcqxc\" (UID: \"0fd6bf11-4912-4c55-b89d-5866167a0283\") " pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.025913 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fd6bf11-4912-4c55-b89d-5866167a0283-config\") pod \"controller-manager-6d6f47df77-zcqxc\" (UID: \"0fd6bf11-4912-4c55-b89d-5866167a0283\") " pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.026830 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0fd6bf11-4912-4c55-b89d-5866167a0283-client-ca\") pod \"controller-manager-6d6f47df77-zcqxc\" (UID: \"0fd6bf11-4912-4c55-b89d-5866167a0283\") " pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.027595 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd6bf11-4912-4c55-b89d-5866167a0283-serving-cert\") pod \"controller-manager-6d6f47df77-zcqxc\" (UID: \"0fd6bf11-4912-4c55-b89d-5866167a0283\") " pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.053154 4898 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-pz6xv\" (UniqueName: \"kubernetes.io/projected/0fd6bf11-4912-4c55-b89d-5866167a0283-kube-api-access-pz6xv\") pod \"controller-manager-6d6f47df77-zcqxc\" (UID: \"0fd6bf11-4912-4c55-b89d-5866167a0283\") " pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.060067 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.448385 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d6f47df77-zcqxc"] Dec 11 13:10:14 crc kubenswrapper[4898]: W1211 13:10:14.455807 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fd6bf11_4912_4c55_b89d_5866167a0283.slice/crio-12283f32ecd87706ad50740ea140aadabae4f1e31f27f5ef417830b440245a91 WatchSource:0}: Error finding container 12283f32ecd87706ad50740ea140aadabae4f1e31f27f5ef417830b440245a91: Status 404 returned error can't find the container with id 12283f32ecd87706ad50740ea140aadabae4f1e31f27f5ef417830b440245a91 Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.556230 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" event={"ID":"0fd6bf11-4912-4c55-b89d-5866167a0283","Type":"ContainerStarted","Data":"12283f32ecd87706ad50740ea140aadabae4f1e31f27f5ef417830b440245a91"} Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.557844 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.557879 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dbf56b754-mxg92" event={"ID":"1c9a929f-745b-48d6-9799-f762ee7b14fe","Type":"ContainerDied","Data":"fdd29a949ca0d9927f5e4691f7ecadc2f1ac9ddffd00b5cbf3c36da6fa32727e"} Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.557949 4898 scope.go:117] "RemoveContainer" containerID="ef4276fe2fd24f0cf5ade8cb78337c92a6e94ed9be75fb32e3ecbf6510057c7e" Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.583285 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-dbf56b754-mxg92"] Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.588671 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-dbf56b754-mxg92"] Dec 11 13:10:14 crc kubenswrapper[4898]: I1211 13:10:14.784317 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c9a929f-745b-48d6-9799-f762ee7b14fe" path="/var/lib/kubelet/pods/1c9a929f-745b-48d6-9799-f762ee7b14fe/volumes" Dec 11 13:10:15 crc kubenswrapper[4898]: I1211 13:10:15.573001 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" event={"ID":"0fd6bf11-4912-4c55-b89d-5866167a0283","Type":"ContainerStarted","Data":"0c657109b177cacd0383bd9aa3611de1715c1ce9d7557c7ea26e926288884439"} Dec 11 13:10:15 crc kubenswrapper[4898]: I1211 13:10:15.573405 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 13:10:15 crc kubenswrapper[4898]: I1211 13:10:15.578765 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" Dec 11 
13:10:15 crc kubenswrapper[4898]: I1211 13:10:15.598715 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" podStartSLOduration=3.598680088 podStartE2EDuration="3.598680088s" podCreationTimestamp="2025-12-11 13:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:10:15.59436229 +0000 UTC m=+373.166688737" watchObservedRunningTime="2025-12-11 13:10:15.598680088 +0000 UTC m=+373.171006525" Dec 11 13:10:16 crc kubenswrapper[4898]: I1211 13:10:16.053216 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kxmhp" Dec 11 13:10:16 crc kubenswrapper[4898]: I1211 13:10:16.096095 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kxmhp" Dec 11 13:10:30 crc kubenswrapper[4898]: I1211 13:10:30.908797 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-56mlq\" (UID: \"eca70e56-1feb-4a48-9c32-db8e075ebfff\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" Dec 11 13:10:30 crc kubenswrapper[4898]: I1211 13:10:30.919152 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/eca70e56-1feb-4a48-9c32-db8e075ebfff-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-56mlq\" (UID: \"eca70e56-1feb-4a48-9c32-db8e075ebfff\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" Dec 11 13:10:31 crc kubenswrapper[4898]: I1211 13:10:31.028522 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-66df7c8f76-jhl8f" Dec 11 13:10:31 crc kubenswrapper[4898]: I1211 13:10:31.087351 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g9vz7"] Dec 11 13:10:31 crc kubenswrapper[4898]: I1211 13:10:31.171973 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" Dec 11 13:10:31 crc kubenswrapper[4898]: I1211 13:10:31.574551 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq"] Dec 11 13:10:31 crc kubenswrapper[4898]: I1211 13:10:31.670388 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" event={"ID":"eca70e56-1feb-4a48-9c32-db8e075ebfff","Type":"ContainerStarted","Data":"3fcb4734d4370ccb3701614a539c85e8f66fdb2f2824100dab5cd1b2016f9b80"} Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.152227 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-98998769b-5vw79"] Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.152613 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" podUID="fdc5cc96-d8f0-48ea-9837-a74863c2d69d" containerName="route-controller-manager" containerID="cri-o://7dc4b9ab4508aaba07a27b6f6ac01a3abfca8bdba82c5eebc1b670771da0daf6" gracePeriod=30 Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.635280 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.677732 4898 generic.go:334] "Generic (PLEG): container finished" podID="fdc5cc96-d8f0-48ea-9837-a74863c2d69d" containerID="7dc4b9ab4508aaba07a27b6f6ac01a3abfca8bdba82c5eebc1b670771da0daf6" exitCode=0 Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.677777 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" event={"ID":"fdc5cc96-d8f0-48ea-9837-a74863c2d69d","Type":"ContainerDied","Data":"7dc4b9ab4508aaba07a27b6f6ac01a3abfca8bdba82c5eebc1b670771da0daf6"} Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.677807 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" event={"ID":"fdc5cc96-d8f0-48ea-9837-a74863c2d69d","Type":"ContainerDied","Data":"735fccc1a2af25fba2346fbd7ee28fe8966af1e1cc6f15b7523199a12df71171"} Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.677828 4898 scope.go:117] "RemoveContainer" containerID="7dc4b9ab4508aaba07a27b6f6ac01a3abfca8bdba82c5eebc1b670771da0daf6" Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.677937 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-98998769b-5vw79" Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.694986 4898 scope.go:117] "RemoveContainer" containerID="7dc4b9ab4508aaba07a27b6f6ac01a3abfca8bdba82c5eebc1b670771da0daf6" Dec 11 13:10:32 crc kubenswrapper[4898]: E1211 13:10:32.695525 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dc4b9ab4508aaba07a27b6f6ac01a3abfca8bdba82c5eebc1b670771da0daf6\": container with ID starting with 7dc4b9ab4508aaba07a27b6f6ac01a3abfca8bdba82c5eebc1b670771da0daf6 not found: ID does not exist" containerID="7dc4b9ab4508aaba07a27b6f6ac01a3abfca8bdba82c5eebc1b670771da0daf6" Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.695569 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc4b9ab4508aaba07a27b6f6ac01a3abfca8bdba82c5eebc1b670771da0daf6"} err="failed to get container status \"7dc4b9ab4508aaba07a27b6f6ac01a3abfca8bdba82c5eebc1b670771da0daf6\": rpc error: code = NotFound desc = could not find container \"7dc4b9ab4508aaba07a27b6f6ac01a3abfca8bdba82c5eebc1b670771da0daf6\": container with ID starting with 7dc4b9ab4508aaba07a27b6f6ac01a3abfca8bdba82c5eebc1b670771da0daf6 not found: ID does not exist" Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.731951 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4sj8\" (UniqueName: \"kubernetes.io/projected/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-kube-api-access-f4sj8\") pod \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\" (UID: \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.732005 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-serving-cert\") pod 
\"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\" (UID: \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.732064 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-client-ca\") pod \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\" (UID: \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.732108 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-config\") pod \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\" (UID: \"fdc5cc96-d8f0-48ea-9837-a74863c2d69d\") " Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.732892 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-client-ca" (OuterVolumeSpecName: "client-ca") pod "fdc5cc96-d8f0-48ea-9837-a74863c2d69d" (UID: "fdc5cc96-d8f0-48ea-9837-a74863c2d69d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.733023 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-config" (OuterVolumeSpecName: "config") pod "fdc5cc96-d8f0-48ea-9837-a74863c2d69d" (UID: "fdc5cc96-d8f0-48ea-9837-a74863c2d69d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.733279 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-client-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.733298 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.737699 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fdc5cc96-d8f0-48ea-9837-a74863c2d69d" (UID: "fdc5cc96-d8f0-48ea-9837-a74863c2d69d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.740616 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-kube-api-access-f4sj8" (OuterVolumeSpecName: "kube-api-access-f4sj8") pod "fdc5cc96-d8f0-48ea-9837-a74863c2d69d" (UID: "fdc5cc96-d8f0-48ea-9837-a74863c2d69d"). InnerVolumeSpecName "kube-api-access-f4sj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.834957 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4sj8\" (UniqueName: \"kubernetes.io/projected/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-kube-api-access-f4sj8\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.835193 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdc5cc96-d8f0-48ea-9837-a74863c2d69d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:32 crc kubenswrapper[4898]: I1211 13:10:32.996795 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-98998769b-5vw79"] Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.002437 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-98998769b-5vw79"] Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.586622 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs"] Dec 11 13:10:33 crc kubenswrapper[4898]: E1211 13:10:33.587090 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdc5cc96-d8f0-48ea-9837-a74863c2d69d" containerName="route-controller-manager" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.587128 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc5cc96-d8f0-48ea-9837-a74863c2d69d" containerName="route-controller-manager" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.587349 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc5cc96-d8f0-48ea-9837-a74863c2d69d" containerName="route-controller-manager" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.588136 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.591424 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.591453 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.591516 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.591545 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.591787 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.592786 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs"] Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.594115 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.647260 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dcb95b5-448f-4cc1-8399-9c4c1cc5046f-serving-cert\") pod \"route-controller-manager-6fcffdd775-jsnhs\" (UID: \"2dcb95b5-448f-4cc1-8399-9c4c1cc5046f\") " pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.647388 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dcb95b5-448f-4cc1-8399-9c4c1cc5046f-client-ca\") pod \"route-controller-manager-6fcffdd775-jsnhs\" (UID: \"2dcb95b5-448f-4cc1-8399-9c4c1cc5046f\") " pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.647444 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmrq6\" (UniqueName: \"kubernetes.io/projected/2dcb95b5-448f-4cc1-8399-9c4c1cc5046f-kube-api-access-kmrq6\") pod \"route-controller-manager-6fcffdd775-jsnhs\" (UID: \"2dcb95b5-448f-4cc1-8399-9c4c1cc5046f\") " pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.647655 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dcb95b5-448f-4cc1-8399-9c4c1cc5046f-config\") pod \"route-controller-manager-6fcffdd775-jsnhs\" (UID: \"2dcb95b5-448f-4cc1-8399-9c4c1cc5046f\") " pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.685350 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" event={"ID":"eca70e56-1feb-4a48-9c32-db8e075ebfff","Type":"ContainerStarted","Data":"f21ad34c8231f03401332d905a77343435f61cd8d14cbe06f8e36f60082ae522"} Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.685691 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.695812 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.708224 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" podStartSLOduration=66.082449785 podStartE2EDuration="1m7.708194747s" podCreationTimestamp="2025-12-11 13:09:26 +0000 UTC" firstStartedPulling="2025-12-11 13:10:31.586126299 +0000 UTC m=+389.158452736" lastFinishedPulling="2025-12-11 13:10:33.211871261 +0000 UTC m=+390.784197698" observedRunningTime="2025-12-11 13:10:33.70281682 +0000 UTC m=+391.275143267" watchObservedRunningTime="2025-12-11 13:10:33.708194747 +0000 UTC m=+391.280521224" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.748672 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dcb95b5-448f-4cc1-8399-9c4c1cc5046f-serving-cert\") pod \"route-controller-manager-6fcffdd775-jsnhs\" (UID: \"2dcb95b5-448f-4cc1-8399-9c4c1cc5046f\") " pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.748809 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dcb95b5-448f-4cc1-8399-9c4c1cc5046f-client-ca\") pod \"route-controller-manager-6fcffdd775-jsnhs\" (UID: \"2dcb95b5-448f-4cc1-8399-9c4c1cc5046f\") " pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.748847 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmrq6\" (UniqueName: \"kubernetes.io/projected/2dcb95b5-448f-4cc1-8399-9c4c1cc5046f-kube-api-access-kmrq6\") pod \"route-controller-manager-6fcffdd775-jsnhs\" (UID: \"2dcb95b5-448f-4cc1-8399-9c4c1cc5046f\") " 
pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.748921 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dcb95b5-448f-4cc1-8399-9c4c1cc5046f-config\") pod \"route-controller-manager-6fcffdd775-jsnhs\" (UID: \"2dcb95b5-448f-4cc1-8399-9c4c1cc5046f\") " pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.750790 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dcb95b5-448f-4cc1-8399-9c4c1cc5046f-config\") pod \"route-controller-manager-6fcffdd775-jsnhs\" (UID: \"2dcb95b5-448f-4cc1-8399-9c4c1cc5046f\") " pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.751548 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dcb95b5-448f-4cc1-8399-9c4c1cc5046f-client-ca\") pod \"route-controller-manager-6fcffdd775-jsnhs\" (UID: \"2dcb95b5-448f-4cc1-8399-9c4c1cc5046f\") " pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.768796 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dcb95b5-448f-4cc1-8399-9c4c1cc5046f-serving-cert\") pod \"route-controller-manager-6fcffdd775-jsnhs\" (UID: \"2dcb95b5-448f-4cc1-8399-9c4c1cc5046f\") " pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.771482 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmrq6\" (UniqueName: 
\"kubernetes.io/projected/2dcb95b5-448f-4cc1-8399-9c4c1cc5046f-kube-api-access-kmrq6\") pod \"route-controller-manager-6fcffdd775-jsnhs\" (UID: \"2dcb95b5-448f-4cc1-8399-9c4c1cc5046f\") " pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:33 crc kubenswrapper[4898]: I1211 13:10:33.909038 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.031523 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-q452r"] Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.032571 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.034815 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.035560 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-567rj" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.035774 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.042974 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.046492 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-q452r"] Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.155910 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/70210806-abb3-447a-a8da-f6f69b2b9206-metrics-client-ca\") pod \"prometheus-operator-db54df47d-q452r\" (UID: \"70210806-abb3-447a-a8da-f6f69b2b9206\") " pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.156324 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/70210806-abb3-447a-a8da-f6f69b2b9206-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-q452r\" (UID: \"70210806-abb3-447a-a8da-f6f69b2b9206\") " pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.156400 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/70210806-abb3-447a-a8da-f6f69b2b9206-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-q452r\" (UID: \"70210806-abb3-447a-a8da-f6f69b2b9206\") " pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.156487 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgg7g\" (UniqueName: \"kubernetes.io/projected/70210806-abb3-447a-a8da-f6f69b2b9206-kube-api-access-vgg7g\") pod \"prometheus-operator-db54df47d-q452r\" (UID: \"70210806-abb3-447a-a8da-f6f69b2b9206\") " pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.257165 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/70210806-abb3-447a-a8da-f6f69b2b9206-metrics-client-ca\") pod \"prometheus-operator-db54df47d-q452r\" (UID: \"70210806-abb3-447a-a8da-f6f69b2b9206\") " 
pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.257226 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/70210806-abb3-447a-a8da-f6f69b2b9206-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-q452r\" (UID: \"70210806-abb3-447a-a8da-f6f69b2b9206\") " pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.257270 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/70210806-abb3-447a-a8da-f6f69b2b9206-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-q452r\" (UID: \"70210806-abb3-447a-a8da-f6f69b2b9206\") " pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.257298 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgg7g\" (UniqueName: \"kubernetes.io/projected/70210806-abb3-447a-a8da-f6f69b2b9206-kube-api-access-vgg7g\") pod \"prometheus-operator-db54df47d-q452r\" (UID: \"70210806-abb3-447a-a8da-f6f69b2b9206\") " pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.258851 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/70210806-abb3-447a-a8da-f6f69b2b9206-metrics-client-ca\") pod \"prometheus-operator-db54df47d-q452r\" (UID: \"70210806-abb3-447a-a8da-f6f69b2b9206\") " pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.263013 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/70210806-abb3-447a-a8da-f6f69b2b9206-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-q452r\" (UID: \"70210806-abb3-447a-a8da-f6f69b2b9206\") " pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.263762 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/70210806-abb3-447a-a8da-f6f69b2b9206-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-q452r\" (UID: \"70210806-abb3-447a-a8da-f6f69b2b9206\") " pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.274048 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgg7g\" (UniqueName: \"kubernetes.io/projected/70210806-abb3-447a-a8da-f6f69b2b9206-kube-api-access-vgg7g\") pod \"prometheus-operator-db54df47d-q452r\" (UID: \"70210806-abb3-447a-a8da-f6f69b2b9206\") " pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.358789 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.372476 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs"] Dec 11 13:10:34 crc kubenswrapper[4898]: W1211 13:10:34.387896 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dcb95b5_448f_4cc1_8399_9c4c1cc5046f.slice/crio-563c2be6076a0089b205e98def0a72a8f727c60e25453f3d9eb5d229836a357a WatchSource:0}: Error finding container 563c2be6076a0089b205e98def0a72a8f727c60e25453f3d9eb5d229836a357a: Status 404 returned error can't find the container with id 563c2be6076a0089b205e98def0a72a8f727c60e25453f3d9eb5d229836a357a Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.693987 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" event={"ID":"2dcb95b5-448f-4cc1-8399-9c4c1cc5046f","Type":"ContainerStarted","Data":"32a8ca710e770b7acc886ab5999148f5516d8db063d401ef58c5a09a44899414"} Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.694437 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" event={"ID":"2dcb95b5-448f-4cc1-8399-9c4c1cc5046f","Type":"ContainerStarted","Data":"563c2be6076a0089b205e98def0a72a8f727c60e25453f3d9eb5d229836a357a"} Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.694481 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.716388 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" podStartSLOduration=2.71636292 
podStartE2EDuration="2.71636292s" podCreationTimestamp="2025-12-11 13:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:10:34.711254501 +0000 UTC m=+392.283580968" watchObservedRunningTime="2025-12-11 13:10:34.71636292 +0000 UTC m=+392.288689357" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.785991 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdc5cc96-d8f0-48ea-9837-a74863c2d69d" path="/var/lib/kubelet/pods/fdc5cc96-d8f0-48ea-9837-a74863c2d69d/volumes" Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.819281 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-q452r"] Dec 11 13:10:34 crc kubenswrapper[4898]: W1211 13:10:34.829820 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70210806_abb3_447a_a8da_f6f69b2b9206.slice/crio-ce7fa0c50bd26d02bcce08610711beac4a4b9b0406fd4c94054806341c8055ec WatchSource:0}: Error finding container ce7fa0c50bd26d02bcce08610711beac4a4b9b0406fd4c94054806341c8055ec: Status 404 returned error can't find the container with id ce7fa0c50bd26d02bcce08610711beac4a4b9b0406fd4c94054806341c8055ec Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.995712 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:10:34 crc kubenswrapper[4898]: I1211 13:10:34.995772 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:10:35 crc kubenswrapper[4898]: I1211 13:10:35.095097 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" Dec 11 13:10:35 crc kubenswrapper[4898]: I1211 13:10:35.702498 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" event={"ID":"70210806-abb3-447a-a8da-f6f69b2b9206","Type":"ContainerStarted","Data":"ce7fa0c50bd26d02bcce08610711beac4a4b9b0406fd4c94054806341c8055ec"} Dec 11 13:10:36 crc kubenswrapper[4898]: I1211 13:10:36.709976 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" event={"ID":"70210806-abb3-447a-a8da-f6f69b2b9206","Type":"ContainerStarted","Data":"6b2e39271581522521c287e294393785e857db2914245296760dcff1a69bc9d8"} Dec 11 13:10:37 crc kubenswrapper[4898]: I1211 13:10:37.717364 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" event={"ID":"70210806-abb3-447a-a8da-f6f69b2b9206","Type":"ContainerStarted","Data":"788b8a5ca4d8e37e343a745d99a3ae034ecf87d7bd50097ccde20ebcb3ed22b4"} Dec 11 13:10:37 crc kubenswrapper[4898]: I1211 13:10:37.743001 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-q452r" podStartSLOduration=2.132363734 podStartE2EDuration="3.742761722s" podCreationTimestamp="2025-12-11 13:10:34 +0000 UTC" firstStartedPulling="2025-12-11 13:10:34.833579395 +0000 UTC m=+392.405905832" lastFinishedPulling="2025-12-11 13:10:36.443977383 +0000 UTC m=+394.016303820" observedRunningTime="2025-12-11 13:10:37.741525599 +0000 UTC m=+395.313852056" watchObservedRunningTime="2025-12-11 13:10:37.742761722 +0000 UTC m=+395.315088149" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 
13:10:39.450633 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr"] Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.452227 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.458911 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.458918 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-qqqn8" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.458915 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.485719 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr"] Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.497769 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4"] Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.507958 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.510692 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.510924 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-4kbh2" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.511141 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.511314 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.526972 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4"] Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.549804 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05eede04-2d90-4979-908e-29a4d88daf36-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-p7nkr\" (UID: \"05eede04-2d90-4979-908e-29a4d88daf36\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.549868 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ba58f08b-30ee-4ed4-9156-ce30817e7231-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" 
Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.549892 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba58f08b-30ee-4ed4-9156-ce30817e7231-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.549910 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkm9g\" (UniqueName: \"kubernetes.io/projected/05eede04-2d90-4979-908e-29a4d88daf36-kube-api-access-qkm9g\") pod \"openshift-state-metrics-566fddb674-p7nkr\" (UID: \"05eede04-2d90-4979-908e-29a4d88daf36\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.549947 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba58f08b-30ee-4ed4-9156-ce30817e7231-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.549971 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05eede04-2d90-4979-908e-29a4d88daf36-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-p7nkr\" (UID: \"05eede04-2d90-4979-908e-29a4d88daf36\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.549997 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/05eede04-2d90-4979-908e-29a4d88daf36-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-p7nkr\" (UID: \"05eede04-2d90-4979-908e-29a4d88daf36\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.550031 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ba58f08b-30ee-4ed4-9156-ce30817e7231-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.550061 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ba58f08b-30ee-4ed4-9156-ce30817e7231-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.550100 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xng9c\" (UniqueName: \"kubernetes.io/projected/ba58f08b-30ee-4ed4-9156-ce30817e7231-kube-api-access-xng9c\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.563555 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8p8qb"] Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.568051 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.571113 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.571113 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-d9wmc" Dec 11 13:10:39 crc kubenswrapper[4898]: I1211 13:10:39.571122 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651416 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ba58f08b-30ee-4ed4-9156-ce30817e7231-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651479 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xng9c\" (UniqueName: \"kubernetes.io/projected/ba58f08b-30ee-4ed4-9156-ce30817e7231-kube-api-access-xng9c\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651516 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05eede04-2d90-4979-908e-29a4d88daf36-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-p7nkr\" (UID: \"05eede04-2d90-4979-908e-29a4d88daf36\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" 
Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651540 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-node-exporter-wtmp\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651560 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-node-exporter-textfile\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651580 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ba58f08b-30ee-4ed4-9156-ce30817e7231-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651603 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba58f08b-30ee-4ed4-9156-ce30817e7231-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651619 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkm9g\" (UniqueName: \"kubernetes.io/projected/05eede04-2d90-4979-908e-29a4d88daf36-kube-api-access-qkm9g\") pod 
\"openshift-state-metrics-566fddb674-p7nkr\" (UID: \"05eede04-2d90-4979-908e-29a4d88daf36\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651641 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba58f08b-30ee-4ed4-9156-ce30817e7231-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651664 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05eede04-2d90-4979-908e-29a4d88daf36-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-p7nkr\" (UID: \"05eede04-2d90-4979-908e-29a4d88daf36\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651682 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651708 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/05eede04-2d90-4979-908e-29a4d88daf36-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-p7nkr\" (UID: \"05eede04-2d90-4979-908e-29a4d88daf36\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651723 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-metrics-client-ca\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651737 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kcz8\" (UniqueName: \"kubernetes.io/projected/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-kube-api-access-8kcz8\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651756 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ba58f08b-30ee-4ed4-9156-ce30817e7231-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651771 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-node-exporter-tls\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651786 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-root\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" 
Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.651805 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-sys\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: E1211 13:10:39.652599 4898 secret.go:188] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Dec 11 13:10:41 crc kubenswrapper[4898]: E1211 13:10:39.652666 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05eede04-2d90-4979-908e-29a4d88daf36-openshift-state-metrics-tls podName:05eede04-2d90-4979-908e-29a4d88daf36 nodeName:}" failed. No retries permitted until 2025-12-11 13:10:40.152647337 +0000 UTC m=+397.724973774 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/05eede04-2d90-4979-908e-29a4d88daf36-openshift-state-metrics-tls") pod "openshift-state-metrics-566fddb674-p7nkr" (UID: "05eede04-2d90-4979-908e-29a4d88daf36") : secret "openshift-state-metrics-tls" not found Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.652782 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05eede04-2d90-4979-908e-29a4d88daf36-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-p7nkr\" (UID: \"05eede04-2d90-4979-908e-29a4d88daf36\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.652807 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba58f08b-30ee-4ed4-9156-ce30817e7231-metrics-client-ca\") pod 
\"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.653301 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ba58f08b-30ee-4ed4-9156-ce30817e7231-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.653333 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ba58f08b-30ee-4ed4-9156-ce30817e7231-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.658542 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ba58f08b-30ee-4ed4-9156-ce30817e7231-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.659098 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05eede04-2d90-4979-908e-29a4d88daf36-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-p7nkr\" (UID: \"05eede04-2d90-4979-908e-29a4d88daf36\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" Dec 11 
13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.679776 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba58f08b-30ee-4ed4-9156-ce30817e7231-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.695430 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkm9g\" (UniqueName: \"kubernetes.io/projected/05eede04-2d90-4979-908e-29a4d88daf36-kube-api-access-qkm9g\") pod \"openshift-state-metrics-566fddb674-p7nkr\" (UID: \"05eede04-2d90-4979-908e-29a4d88daf36\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.696107 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xng9c\" (UniqueName: \"kubernetes.io/projected/ba58f08b-30ee-4ed4-9156-ce30817e7231-kube-api-access-xng9c\") pod \"kube-state-metrics-777cb5bd5d-cwpg4\" (UID: \"ba58f08b-30ee-4ed4-9156-ce30817e7231\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.752652 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-metrics-client-ca\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.752965 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kcz8\" (UniqueName: \"kubernetes.io/projected/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-kube-api-access-8kcz8\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " 
pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.752982 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-node-exporter-tls\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.752997 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-root\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.753022 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-sys\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.753159 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-root\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: E1211 13:10:39.753163 4898 secret.go:188] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Dec 11 13:10:41 crc kubenswrapper[4898]: E1211 13:10:39.753277 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-node-exporter-tls podName:13ca8d97-b6c7-4670-908e-d4cd949bcb0b nodeName:}" failed. 
No retries permitted until 2025-12-11 13:10:40.253255133 +0000 UTC m=+397.825581570 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-node-exporter-tls") pod "node-exporter-8p8qb" (UID: "13ca8d97-b6c7-4670-908e-d4cd949bcb0b") : secret "node-exporter-tls" not found Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.753278 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-sys\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.753425 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-node-exporter-wtmp\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.753443 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-metrics-client-ca\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.753672 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-node-exporter-wtmp\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.753449 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-node-exporter-textfile\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.753783 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.753876 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-node-exporter-textfile\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.771536 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.778214 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kcz8\" (UniqueName: \"kubernetes.io/projected/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-kube-api-access-8kcz8\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:39.837963 4898 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.163021 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/05eede04-2d90-4979-908e-29a4d88daf36-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-p7nkr\" (UID: \"05eede04-2d90-4979-908e-29a4d88daf36\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.167131 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/05eede04-2d90-4979-908e-29a4d88daf36-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-p7nkr\" (UID: \"05eede04-2d90-4979-908e-29a4d88daf36\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.264707 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-node-exporter-tls\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.268606 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/13ca8d97-b6c7-4670-908e-d4cd949bcb0b-node-exporter-tls\") pod \"node-exporter-8p8qb\" (UID: \"13ca8d97-b6c7-4670-908e-d4cd949bcb0b\") " pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.409813 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.488155 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8p8qb" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.621155 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.627850 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.629865 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.630149 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.630342 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-zfhwx" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.630502 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.630740 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.631406 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.632689 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.632750 4898 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.639777 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.642477 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.669557 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.669594 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33b2dad5-ea59-439e-8659-60fdaba49f82-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.669612 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33b2dad5-ea59-439e-8659-60fdaba49f82-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.669823 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/33b2dad5-ea59-439e-8659-60fdaba49f82-tls-assets\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.669877 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.669901 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33b2dad5-ea59-439e-8659-60fdaba49f82-config-out\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.669970 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/33b2dad5-ea59-439e-8659-60fdaba49f82-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.670002 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpkp4\" (UniqueName: \"kubernetes.io/projected/33b2dad5-ea59-439e-8659-60fdaba49f82-kube-api-access-gpkp4\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.670121 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-config-volume\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.670156 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.670210 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.670246 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-web-config\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.771394 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33b2dad5-ea59-439e-8659-60fdaba49f82-tls-assets\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 
13:10:40.771709 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.771732 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33b2dad5-ea59-439e-8659-60fdaba49f82-config-out\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.771755 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/33b2dad5-ea59-439e-8659-60fdaba49f82-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.771773 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpkp4\" (UniqueName: \"kubernetes.io/projected/33b2dad5-ea59-439e-8659-60fdaba49f82-kube-api-access-gpkp4\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.771810 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-config-volume\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.771827 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.771851 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.771873 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-web-config\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.771901 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.771918 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33b2dad5-ea59-439e-8659-60fdaba49f82-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc 
kubenswrapper[4898]: I1211 13:10:40.771934 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33b2dad5-ea59-439e-8659-60fdaba49f82-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.772504 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/33b2dad5-ea59-439e-8659-60fdaba49f82-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.773343 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33b2dad5-ea59-439e-8659-60fdaba49f82-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.774242 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33b2dad5-ea59-439e-8659-60fdaba49f82-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.778634 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-web-config\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.781431 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.783355 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.784402 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.784871 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33b2dad5-ea59-439e-8659-60fdaba49f82-tls-assets\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.791258 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: 
I1211 13:10:40.792900 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33b2dad5-ea59-439e-8659-60fdaba49f82-config-out\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.801150 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpkp4\" (UniqueName: \"kubernetes.io/projected/33b2dad5-ea59-439e-8659-60fdaba49f82-kube-api-access-gpkp4\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.814323 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/33b2dad5-ea59-439e-8659-60fdaba49f82-config-volume\") pod \"alertmanager-main-0\" (UID: \"33b2dad5-ea59-439e-8659-60fdaba49f82\") " pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:40.956994 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 11 13:10:41 crc kubenswrapper[4898]: W1211 13:10:41.190018 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13ca8d97_b6c7_4670_908e_d4cd949bcb0b.slice/crio-358f542d7ac5f991fd04c6638c947daa47bee057a08c9c88da7f12c3db2060df WatchSource:0}: Error finding container 358f542d7ac5f991fd04c6638c947daa47bee057a08c9c88da7f12c3db2060df: Status 404 returned error can't find the container with id 358f542d7ac5f991fd04c6638c947daa47bee057a08c9c88da7f12c3db2060df Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.510478 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-b96d76946-svz9h"] Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.512492 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.516056 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.516586 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-f4vmc" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.516687 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-1jdnda0g57d4d" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.516851 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.517013 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.517106 4898 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.517205 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.572146 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-b96d76946-svz9h"] Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.583124 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7b6m\" (UniqueName: \"kubernetes.io/projected/7cd8b643-f10c-480e-b851-1d13d35eba18-kube-api-access-h7b6m\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.583175 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.583200 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-thanos-querier-tls\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.583224 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-grpc-tls\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.583251 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.583293 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.583340 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7cd8b643-f10c-480e-b851-1d13d35eba18-metrics-client-ca\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.583360 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-thanos-querier-kube-rbac-proxy-rules\") pod 
\"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.684175 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7cd8b643-f10c-480e-b851-1d13d35eba18-metrics-client-ca\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.684240 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.684312 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7b6m\" (UniqueName: \"kubernetes.io/projected/7cd8b643-f10c-480e-b851-1d13d35eba18-kube-api-access-h7b6m\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.684344 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.684387 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-thanos-querier-tls\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.684495 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-grpc-tls\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.684546 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.684602 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.685178 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7cd8b643-f10c-480e-b851-1d13d35eba18-metrics-client-ca\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " 
pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.691657 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-thanos-querier-tls\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.691776 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-grpc-tls\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.692148 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.692795 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.693353 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.697135 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7cd8b643-f10c-480e-b851-1d13d35eba18-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.703605 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7b6m\" (UniqueName: \"kubernetes.io/projected/7cd8b643-f10c-480e-b851-1d13d35eba18-kube-api-access-h7b6m\") pod \"thanos-querier-b96d76946-svz9h\" (UID: \"7cd8b643-f10c-480e-b851-1d13d35eba18\") " pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.749474 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8p8qb" event={"ID":"13ca8d97-b6c7-4670-908e-d4cd949bcb0b","Type":"ContainerStarted","Data":"358f542d7ac5f991fd04c6638c947daa47bee057a08c9c88da7f12c3db2060df"} Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.815993 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4"] Dec 11 13:10:41 crc kubenswrapper[4898]: W1211 13:10:41.817281 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba58f08b_30ee_4ed4_9156_ce30817e7231.slice/crio-239b44f7dbd7743e4481dd5937ff6ca84f5bfc4b24e1a8548036aa65bcdc9e22 WatchSource:0}: Error finding container 
239b44f7dbd7743e4481dd5937ff6ca84f5bfc4b24e1a8548036aa65bcdc9e22: Status 404 returned error can't find the container with id 239b44f7dbd7743e4481dd5937ff6ca84f5bfc4b24e1a8548036aa65bcdc9e22 Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.862138 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.884606 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr"] Dec 11 13:10:41 crc kubenswrapper[4898]: W1211 13:10:41.893053 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05eede04_2d90_4979_908e_29a4d88daf36.slice/crio-b9d606c4c98ffa106c1a7f2c2b1ff3b2844dadf776acaf558316353e4a8ec422 WatchSource:0}: Error finding container b9d606c4c98ffa106c1a7f2c2b1ff3b2844dadf776acaf558316353e4a8ec422: Status 404 returned error can't find the container with id b9d606c4c98ffa106c1a7f2c2b1ff3b2844dadf776acaf558316353e4a8ec422 Dec 11 13:10:41 crc kubenswrapper[4898]: I1211 13:10:41.894649 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 11 13:10:41 crc kubenswrapper[4898]: W1211 13:10:41.895008 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33b2dad5_ea59_439e_8659_60fdaba49f82.slice/crio-7d86f60aea6a3dd4636aa48ac84c5d24eb47d3260702905674dc73581f1ab431 WatchSource:0}: Error finding container 7d86f60aea6a3dd4636aa48ac84c5d24eb47d3260702905674dc73581f1ab431: Status 404 returned error can't find the container with id 7d86f60aea6a3dd4636aa48ac84c5d24eb47d3260702905674dc73581f1ab431 Dec 11 13:10:42 crc kubenswrapper[4898]: I1211 13:10:42.288975 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-b96d76946-svz9h"] Dec 
11 13:10:42 crc kubenswrapper[4898]: I1211 13:10:42.756188 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" event={"ID":"05eede04-2d90-4979-908e-29a4d88daf36","Type":"ContainerStarted","Data":"164ecf3beb64a311bc8d3ae9f8613629fc28fc817405788aca4cd9dfa80e402d"} Dec 11 13:10:42 crc kubenswrapper[4898]: I1211 13:10:42.756785 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" event={"ID":"05eede04-2d90-4979-908e-29a4d88daf36","Type":"ContainerStarted","Data":"876b4eed4af45d8dc9f11d6ce6e9e8ed6cb361ce04c3ed3c13aa9d65d34f884e"} Dec 11 13:10:42 crc kubenswrapper[4898]: I1211 13:10:42.756806 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" event={"ID":"05eede04-2d90-4979-908e-29a4d88daf36","Type":"ContainerStarted","Data":"b9d606c4c98ffa106c1a7f2c2b1ff3b2844dadf776acaf558316353e4a8ec422"} Dec 11 13:10:42 crc kubenswrapper[4898]: I1211 13:10:42.757204 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" event={"ID":"7cd8b643-f10c-480e-b851-1d13d35eba18","Type":"ContainerStarted","Data":"b672a047c501ce07cae8ee1d04c80e6f5fc40aba5e3f8fa01cf8b002a9f2acff"} Dec 11 13:10:42 crc kubenswrapper[4898]: I1211 13:10:42.758531 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" event={"ID":"ba58f08b-30ee-4ed4-9156-ce30817e7231","Type":"ContainerStarted","Data":"239b44f7dbd7743e4481dd5937ff6ca84f5bfc4b24e1a8548036aa65bcdc9e22"} Dec 11 13:10:42 crc kubenswrapper[4898]: I1211 13:10:42.759887 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33b2dad5-ea59-439e-8659-60fdaba49f82","Type":"ContainerStarted","Data":"7d86f60aea6a3dd4636aa48ac84c5d24eb47d3260702905674dc73581f1ab431"} Dec 11 
13:10:43 crc kubenswrapper[4898]: I1211 13:10:43.767312 4898 generic.go:334] "Generic (PLEG): container finished" podID="13ca8d97-b6c7-4670-908e-d4cd949bcb0b" containerID="5f6c5fc08ffebef70210f6d9385421419092a39ffd2e46894eb443e748128538" exitCode=0 Dec 11 13:10:43 crc kubenswrapper[4898]: I1211 13:10:43.767803 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8p8qb" event={"ID":"13ca8d97-b6c7-4670-908e-d4cd949bcb0b","Type":"ContainerDied","Data":"5f6c5fc08ffebef70210f6d9385421419092a39ffd2e46894eb443e748128538"} Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.179292 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-8cd884dbc-x76wj"] Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.180661 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.206897 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8cd884dbc-x76wj"] Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.227546 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9e31705-36ee-462a-82ce-7dd3195f945b-console-serving-cert\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.227616 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9e31705-36ee-462a-82ce-7dd3195f945b-console-oauth-config\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.227642 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-oauth-serving-cert\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.227664 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-service-ca\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.227684 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-console-config\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.227707 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-trusted-ca-bundle\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.227795 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9gnl\" (UniqueName: \"kubernetes.io/projected/b9e31705-36ee-462a-82ce-7dd3195f945b-kube-api-access-p9gnl\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 
crc kubenswrapper[4898]: I1211 13:10:44.328957 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9e31705-36ee-462a-82ce-7dd3195f945b-console-oauth-config\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.329000 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-oauth-serving-cert\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.329037 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-service-ca\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.329065 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-console-config\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.329098 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-trusted-ca-bundle\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.329120 
4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9gnl\" (UniqueName: \"kubernetes.io/projected/b9e31705-36ee-462a-82ce-7dd3195f945b-kube-api-access-p9gnl\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.329157 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9e31705-36ee-462a-82ce-7dd3195f945b-console-serving-cert\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.330447 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-console-config\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.330506 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-service-ca\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.330536 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-oauth-serving-cert\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.330880 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-trusted-ca-bundle\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.334320 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9e31705-36ee-462a-82ce-7dd3195f945b-console-oauth-config\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.344036 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9e31705-36ee-462a-82ce-7dd3195f945b-console-serving-cert\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.346183 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9gnl\" (UniqueName: \"kubernetes.io/projected/b9e31705-36ee-462a-82ce-7dd3195f945b-kube-api-access-p9gnl\") pod \"console-8cd884dbc-x76wj\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.505434 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.746109 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7c75cc77ff-zj4jw"] Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.747361 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.752770 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.753034 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.753205 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-jdbtn" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.753987 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.754169 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-b24k2eagpaooq" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.757093 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.758196 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c75cc77ff-zj4jw"] Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.836811 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/faa81158-1b24-4a0a-8fb6-f362177c51fd-audit-log\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.836892 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skwpw\" (UniqueName: 
\"kubernetes.io/projected/faa81158-1b24-4a0a-8fb6-f362177c51fd-kube-api-access-skwpw\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.836932 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/faa81158-1b24-4a0a-8fb6-f362177c51fd-secret-metrics-server-tls\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.836976 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faa81158-1b24-4a0a-8fb6-f362177c51fd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.837005 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa81158-1b24-4a0a-8fb6-f362177c51fd-client-ca-bundle\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.837072 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/faa81158-1b24-4a0a-8fb6-f362177c51fd-secret-metrics-client-certs\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " 
pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.837119 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/faa81158-1b24-4a0a-8fb6-f362177c51fd-metrics-server-audit-profiles\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.937912 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/faa81158-1b24-4a0a-8fb6-f362177c51fd-secret-metrics-client-certs\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.937996 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/faa81158-1b24-4a0a-8fb6-f362177c51fd-metrics-server-audit-profiles\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.938041 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/faa81158-1b24-4a0a-8fb6-f362177c51fd-audit-log\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.938085 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skwpw\" (UniqueName: 
\"kubernetes.io/projected/faa81158-1b24-4a0a-8fb6-f362177c51fd-kube-api-access-skwpw\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.938533 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/faa81158-1b24-4a0a-8fb6-f362177c51fd-secret-metrics-server-tls\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.938573 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faa81158-1b24-4a0a-8fb6-f362177c51fd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.938607 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa81158-1b24-4a0a-8fb6-f362177c51fd-client-ca-bundle\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.940338 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faa81158-1b24-4a0a-8fb6-f362177c51fd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: 
I1211 13:10:44.940726 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/faa81158-1b24-4a0a-8fb6-f362177c51fd-audit-log\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.941680 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/faa81158-1b24-4a0a-8fb6-f362177c51fd-metrics-server-audit-profiles\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.943846 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/faa81158-1b24-4a0a-8fb6-f362177c51fd-secret-metrics-server-tls\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.956217 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/faa81158-1b24-4a0a-8fb6-f362177c51fd-secret-metrics-client-certs\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.956672 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skwpw\" (UniqueName: \"kubernetes.io/projected/faa81158-1b24-4a0a-8fb6-f362177c51fd-kube-api-access-skwpw\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " 
pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:44 crc kubenswrapper[4898]: I1211 13:10:44.958896 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa81158-1b24-4a0a-8fb6-f362177c51fd-client-ca-bundle\") pod \"metrics-server-7c75cc77ff-zj4jw\" (UID: \"faa81158-1b24-4a0a-8fb6-f362177c51fd\") " pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.072879 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.157801 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp"] Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.158532 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.160931 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.161032 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.174515 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp"] Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.242697 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1111559b-96c1-4918-b502-1b5045b8a9da-monitoring-plugin-cert\") pod \"monitoring-plugin-5ffffb4f84-84dnp\" (UID: \"1111559b-96c1-4918-b502-1b5045b8a9da\") " 
pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.345123 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1111559b-96c1-4918-b502-1b5045b8a9da-monitoring-plugin-cert\") pod \"monitoring-plugin-5ffffb4f84-84dnp\" (UID: \"1111559b-96c1-4918-b502-1b5045b8a9da\") " pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.368612 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1111559b-96c1-4918-b502-1b5045b8a9da-monitoring-plugin-cert\") pod \"monitoring-plugin-5ffffb4f84-84dnp\" (UID: \"1111559b-96c1-4918-b502-1b5045b8a9da\") " pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.475144 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.695069 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8cd884dbc-x76wj"] Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.787974 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8cd884dbc-x76wj" event={"ID":"b9e31705-36ee-462a-82ce-7dd3195f945b","Type":"ContainerStarted","Data":"845c0e1769a724643c67bad26c0eb358971cf72fea6e8044abae13ab382bae6f"} Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.804237 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" event={"ID":"05eede04-2d90-4979-908e-29a4d88daf36","Type":"ContainerStarted","Data":"604d5e31fa409f9ffb6ecbb32ef24d648bdfccdd5a046044bcc4b3c197bbe771"} Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.827635 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" event={"ID":"ba58f08b-30ee-4ed4-9156-ce30817e7231","Type":"ContainerStarted","Data":"b68c605bfe94649f2c356ae66b98a0c2c92541b63312789b91785b2616b1da5c"} Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.829866 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33b2dad5-ea59-439e-8659-60fdaba49f82","Type":"ContainerStarted","Data":"f205f79a24aaf699bf09a4238e0e52c6f96113747e9e273ec980fb8edfd3ff47"} Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.842359 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7c75cc77ff-zj4jw"] Dec 11 13:10:45 crc kubenswrapper[4898]: W1211 13:10:45.843996 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaa81158_1b24_4a0a_8fb6_f362177c51fd.slice/crio-679f0bbec4c0f3263caaa4ab9cf9272a071da065f061078b0018d225e05f177c WatchSource:0}: Error finding container 679f0bbec4c0f3263caaa4ab9cf9272a071da065f061078b0018d225e05f177c: Status 404 returned error can't find the container with id 679f0bbec4c0f3263caaa4ab9cf9272a071da065f061078b0018d225e05f177c Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.929224 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.936641 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.940215 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.941145 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.941285 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.941543 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.941669 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.941774 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.941549 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-monitoring"/"serving-certs-ca-bundle" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.941213 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.941930 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.942134 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-8t131begtaj73" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.942289 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-jtfbb" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.947044 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.955147 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.957369 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965114 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965166 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/d4348035-0731-4884-9bae-eaf5d2a693f9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965186 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965202 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4348035-0731-4884-9bae-eaf5d2a693f9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965224 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965247 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965271 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4348035-0731-4884-9bae-eaf5d2a693f9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965290 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d4348035-0731-4884-9bae-eaf5d2a693f9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965308 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965323 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d4348035-0731-4884-9bae-eaf5d2a693f9-config-out\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965344 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4348035-0731-4884-9bae-eaf5d2a693f9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 
13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965365 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965383 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4348035-0731-4884-9bae-eaf5d2a693f9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965398 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-web-config\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965421 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svnj9\" (UniqueName: \"kubernetes.io/projected/d4348035-0731-4884-9bae-eaf5d2a693f9-kube-api-access-svnj9\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965451 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d4348035-0731-4884-9bae-eaf5d2a693f9-tls-assets\") pod \"prometheus-k8s-0\" (UID: 
\"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965483 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:45 crc kubenswrapper[4898]: I1211 13:10:45.965506 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-config\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.058542 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp"] Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069188 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d4348035-0731-4884-9bae-eaf5d2a693f9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069231 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069252 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4348035-0731-4884-9bae-eaf5d2a693f9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069286 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069307 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069329 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4348035-0731-4884-9bae-eaf5d2a693f9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069347 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d4348035-0731-4884-9bae-eaf5d2a693f9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069368 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069383 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d4348035-0731-4884-9bae-eaf5d2a693f9-config-out\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069410 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4348035-0731-4884-9bae-eaf5d2a693f9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069441 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069479 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4348035-0731-4884-9bae-eaf5d2a693f9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069500 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-web-config\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069528 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svnj9\" (UniqueName: \"kubernetes.io/projected/d4348035-0731-4884-9bae-eaf5d2a693f9-kube-api-access-svnj9\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069561 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d4348035-0731-4884-9bae-eaf5d2a693f9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069578 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069598 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-config\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.069625 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.071886 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4348035-0731-4884-9bae-eaf5d2a693f9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.073813 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4348035-0731-4884-9bae-eaf5d2a693f9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.077138 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d4348035-0731-4884-9bae-eaf5d2a693f9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.077228 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.077533 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/d4348035-0731-4884-9bae-eaf5d2a693f9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.078948 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4348035-0731-4884-9bae-eaf5d2a693f9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.079641 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4348035-0731-4884-9bae-eaf5d2a693f9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.080044 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.081319 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.081839 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-web-config\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.082008 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.082295 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.082717 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d4348035-0731-4884-9bae-eaf5d2a693f9-config-out\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.083562 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: W1211 13:10:46.084059 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1111559b_96c1_4918_b502_1b5045b8a9da.slice/crio-70a37ee9818e47d8e0a2487ca73bae4f9e9528631613f45ba801940deb6f2d74 WatchSource:0}: Error finding container 70a37ee9818e47d8e0a2487ca73bae4f9e9528631613f45ba801940deb6f2d74: Status 404 returned error can't find the container with id 70a37ee9818e47d8e0a2487ca73bae4f9e9528631613f45ba801940deb6f2d74 Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.084582 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d4348035-0731-4884-9bae-eaf5d2a693f9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.085812 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-config\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.086173 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d4348035-0731-4884-9bae-eaf5d2a693f9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.094002 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svnj9\" (UniqueName: \"kubernetes.io/projected/d4348035-0731-4884-9bae-eaf5d2a693f9-kube-api-access-svnj9\") pod \"prometheus-k8s-0\" (UID: \"d4348035-0731-4884-9bae-eaf5d2a693f9\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.300586 4898 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.754025 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 11 13:10:46 crc kubenswrapper[4898]: W1211 13:10:46.780441 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4348035_0731_4884_9bae_eaf5d2a693f9.slice/crio-9d14dfc0f944381bf72f9de38ae8dab906bd5ec435c1f153e21f3964d02ff0cd WatchSource:0}: Error finding container 9d14dfc0f944381bf72f9de38ae8dab906bd5ec435c1f153e21f3964d02ff0cd: Status 404 returned error can't find the container with id 9d14dfc0f944381bf72f9de38ae8dab906bd5ec435c1f153e21f3964d02ff0cd Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.844722 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8p8qb" event={"ID":"13ca8d97-b6c7-4670-908e-d4cd949bcb0b","Type":"ContainerStarted","Data":"97becaa7b92193abec97c86668e89ef471d974933c24eed1804bb2e4ffca0a16"} Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.845075 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8p8qb" event={"ID":"13ca8d97-b6c7-4670-908e-d4cd949bcb0b","Type":"ContainerStarted","Data":"56bb2702314fc1e165c4b56316f1940e61c32ff3d01de8c44e66abab2133a33b"} Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.846883 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" event={"ID":"faa81158-1b24-4a0a-8fb6-f362177c51fd","Type":"ContainerStarted","Data":"679f0bbec4c0f3263caaa4ab9cf9272a071da065f061078b0018d225e05f177c"} Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.849611 4898 generic.go:334] "Generic (PLEG): container finished" podID="33b2dad5-ea59-439e-8659-60fdaba49f82" containerID="f205f79a24aaf699bf09a4238e0e52c6f96113747e9e273ec980fb8edfd3ff47" 
exitCode=0 Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.849684 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33b2dad5-ea59-439e-8659-60fdaba49f82","Type":"ContainerDied","Data":"f205f79a24aaf699bf09a4238e0e52c6f96113747e9e273ec980fb8edfd3ff47"} Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.853583 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8cd884dbc-x76wj" event={"ID":"b9e31705-36ee-462a-82ce-7dd3195f945b","Type":"ContainerStarted","Data":"856c44d78997d736f1b3be281864f7c59220a269c44057094a81a4af99d4db13"} Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.856262 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp" event={"ID":"1111559b-96c1-4918-b502-1b5045b8a9da","Type":"ContainerStarted","Data":"70a37ee9818e47d8e0a2487ca73bae4f9e9528631613f45ba801940deb6f2d74"} Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.857308 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d4348035-0731-4884-9bae-eaf5d2a693f9","Type":"ContainerStarted","Data":"9d14dfc0f944381bf72f9de38ae8dab906bd5ec435c1f153e21f3964d02ff0cd"} Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.859217 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" event={"ID":"7cd8b643-f10c-480e-b851-1d13d35eba18","Type":"ContainerStarted","Data":"40a772011fefd8d6de62dca87bdd2204b7d51acbf94a6c775d188c5f0715d4b0"} Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.859237 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" event={"ID":"7cd8b643-f10c-480e-b851-1d13d35eba18","Type":"ContainerStarted","Data":"ecb5340a6e01f3093b965ba06ded061d585292be6d7db7ece1e691e3f9e1b682"} Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 
13:10:46.859250 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" event={"ID":"7cd8b643-f10c-480e-b851-1d13d35eba18","Type":"ContainerStarted","Data":"13741d79b61e0cfdc8ff0b5c52ef366adad75ab3e9622855789426826294fef9"} Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.864169 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" event={"ID":"ba58f08b-30ee-4ed4-9156-ce30817e7231","Type":"ContainerStarted","Data":"a416a81e25dfc38a29b630cd1311c9f885a8787834a67612c52aba80a41bd566"} Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.864236 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" event={"ID":"ba58f08b-30ee-4ed4-9156-ce30817e7231","Type":"ContainerStarted","Data":"e60608241341e3027e9b4c7608b4d3e950d2140846422c30ecb1469215d14ef5"} Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.865109 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8p8qb" podStartSLOduration=6.380083874 podStartE2EDuration="7.865099675s" podCreationTimestamp="2025-12-11 13:10:39 +0000 UTC" firstStartedPulling="2025-12-11 13:10:41.194626169 +0000 UTC m=+398.766952606" lastFinishedPulling="2025-12-11 13:10:42.67964195 +0000 UTC m=+400.251968407" observedRunningTime="2025-12-11 13:10:46.862377982 +0000 UTC m=+404.434704419" watchObservedRunningTime="2025-12-11 13:10:46.865099675 +0000 UTC m=+404.437426112" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.886697 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8cd884dbc-x76wj" podStartSLOduration=2.886682281 podStartE2EDuration="2.886682281s" podCreationTimestamp="2025-12-11 13:10:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-11 13:10:46.885194451 +0000 UTC m=+404.457520898" watchObservedRunningTime="2025-12-11 13:10:46.886682281 +0000 UTC m=+404.459008718" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.942061 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-cwpg4" podStartSLOduration=4.519128564 podStartE2EDuration="7.942041429s" podCreationTimestamp="2025-12-11 13:10:39 +0000 UTC" firstStartedPulling="2025-12-11 13:10:41.819440912 +0000 UTC m=+399.391767349" lastFinishedPulling="2025-12-11 13:10:45.242353777 +0000 UTC m=+402.814680214" observedRunningTime="2025-12-11 13:10:46.931513218 +0000 UTC m=+404.503839665" watchObservedRunningTime="2025-12-11 13:10:46.942041429 +0000 UTC m=+404.514367866" Dec 11 13:10:46 crc kubenswrapper[4898]: I1211 13:10:46.949315 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-p7nkr" podStartSLOduration=4.93957178 podStartE2EDuration="7.949296053s" podCreationTimestamp="2025-12-11 13:10:39 +0000 UTC" firstStartedPulling="2025-12-11 13:10:42.227657301 +0000 UTC m=+399.799983738" lastFinishedPulling="2025-12-11 13:10:45.237381574 +0000 UTC m=+402.809708011" observedRunningTime="2025-12-11 13:10:46.945687426 +0000 UTC m=+404.518013853" watchObservedRunningTime="2025-12-11 13:10:46.949296053 +0000 UTC m=+404.521622490" Dec 11 13:10:47 crc kubenswrapper[4898]: E1211 13:10:47.778297 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4348035_0731_4884_9bae_eaf5d2a693f9.slice/crio-6a15807eb5a82dde520d9301e05c487e8974f2be93b5a1a1dcea39f08d1185f2.scope\": RecentStats: unable to find data in memory cache]" Dec 11 13:10:47 crc kubenswrapper[4898]: I1211 13:10:47.871703 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="d4348035-0731-4884-9bae-eaf5d2a693f9" containerID="6a15807eb5a82dde520d9301e05c487e8974f2be93b5a1a1dcea39f08d1185f2" exitCode=0 Dec 11 13:10:47 crc kubenswrapper[4898]: I1211 13:10:47.871830 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d4348035-0731-4884-9bae-eaf5d2a693f9","Type":"ContainerDied","Data":"6a15807eb5a82dde520d9301e05c487e8974f2be93b5a1a1dcea39f08d1185f2"} Dec 11 13:10:50 crc kubenswrapper[4898]: I1211 13:10:50.895399 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" event={"ID":"faa81158-1b24-4a0a-8fb6-f362177c51fd","Type":"ContainerStarted","Data":"b04b9f5e5f57cab7e73af05fd3f16881cceae4adb2df99b741923ff7084297ae"} Dec 11 13:10:50 crc kubenswrapper[4898]: I1211 13:10:50.897118 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp" event={"ID":"1111559b-96c1-4918-b502-1b5045b8a9da","Type":"ContainerStarted","Data":"309c738f34ffb8800e281a805cc272423331b9c5078d972be381c5c3ba9f53dd"} Dec 11 13:10:50 crc kubenswrapper[4898]: I1211 13:10:50.897302 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp" Dec 11 13:10:50 crc kubenswrapper[4898]: I1211 13:10:50.900537 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" event={"ID":"7cd8b643-f10c-480e-b851-1d13d35eba18","Type":"ContainerStarted","Data":"85a78cca84e38f00d18f1c88c616c80a2cfbe4c02986f2b2def096f5464f8d7f"} Dec 11 13:10:50 crc kubenswrapper[4898]: I1211 13:10:50.902696 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp" Dec 11 13:10:50 crc kubenswrapper[4898]: I1211 13:10:50.912414 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" podStartSLOduration=3.707774015 podStartE2EDuration="6.912395101s" podCreationTimestamp="2025-12-11 13:10:44 +0000 UTC" firstStartedPulling="2025-12-11 13:10:45.849561089 +0000 UTC m=+403.421887536" lastFinishedPulling="2025-12-11 13:10:49.054182185 +0000 UTC m=+406.626508622" observedRunningTime="2025-12-11 13:10:50.910628703 +0000 UTC m=+408.482955140" watchObservedRunningTime="2025-12-11 13:10:50.912395101 +0000 UTC m=+408.484721538" Dec 11 13:10:50 crc kubenswrapper[4898]: I1211 13:10:50.926820 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp" podStartSLOduration=2.951392049 podStartE2EDuration="5.926803415s" podCreationTimestamp="2025-12-11 13:10:45 +0000 UTC" firstStartedPulling="2025-12-11 13:10:46.087754089 +0000 UTC m=+403.660080526" lastFinishedPulling="2025-12-11 13:10:49.063165455 +0000 UTC m=+406.635491892" observedRunningTime="2025-12-11 13:10:50.924525894 +0000 UTC m=+408.496852331" watchObservedRunningTime="2025-12-11 13:10:50.926803415 +0000 UTC m=+408.499129852" Dec 11 13:10:52 crc kubenswrapper[4898]: I1211 13:10:52.914131 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d4348035-0731-4884-9bae-eaf5d2a693f9","Type":"ContainerStarted","Data":"87a5cd3abaee84234537e90d5c56f2bfa1cf0ac15c817ba71b3472268af47830"} Dec 11 13:10:52 crc kubenswrapper[4898]: I1211 13:10:52.914433 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d4348035-0731-4884-9bae-eaf5d2a693f9","Type":"ContainerStarted","Data":"ce0da9b5bfac2d7b5e629ae60bdab536e2223899ff527c197a9a850af2b07d96"} Dec 11 13:10:52 crc kubenswrapper[4898]: I1211 13:10:52.914448 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"d4348035-0731-4884-9bae-eaf5d2a693f9","Type":"ContainerStarted","Data":"0db3f4b85de85e867ad4487cbaea5aae078aa727bea346f275811ccf7bf3c4e9"} Dec 11 13:10:52 crc kubenswrapper[4898]: I1211 13:10:52.914473 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d4348035-0731-4884-9bae-eaf5d2a693f9","Type":"ContainerStarted","Data":"09aa65d086b886fda4562f215f47e0c2fb114690578bfe5b37940e0e4da9183c"} Dec 11 13:10:52 crc kubenswrapper[4898]: I1211 13:10:52.914482 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d4348035-0731-4884-9bae-eaf5d2a693f9","Type":"ContainerStarted","Data":"d74072404cc3b76e261cbdc77dd54e52ca7ed51d4687d7e33c2a08842d19d6c4"} Dec 11 13:10:52 crc kubenswrapper[4898]: I1211 13:10:52.917019 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" event={"ID":"7cd8b643-f10c-480e-b851-1d13d35eba18","Type":"ContainerStarted","Data":"b63816e920c06dfd56487b5c2fd3b25849e2e4b0b326c6fb891fc1072e7ffa04"} Dec 11 13:10:52 crc kubenswrapper[4898]: I1211 13:10:52.917080 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" event={"ID":"7cd8b643-f10c-480e-b851-1d13d35eba18","Type":"ContainerStarted","Data":"6a553be6c230f5fc8c4982ca264facbb4fa8839756d7b7c81519f42090dec12f"} Dec 11 13:10:52 crc kubenswrapper[4898]: I1211 13:10:52.917740 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:52 crc kubenswrapper[4898]: I1211 13:10:52.921485 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33b2dad5-ea59-439e-8659-60fdaba49f82","Type":"ContainerStarted","Data":"922c38558a17d9c5c482c18743d968002acf539a7d359bc097cef08aa14e5aad"} Dec 11 13:10:52 crc kubenswrapper[4898]: I1211 
13:10:52.921514 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33b2dad5-ea59-439e-8659-60fdaba49f82","Type":"ContainerStarted","Data":"f6df407fe80d697b6c1826364c3620c84560a1118eec7897863421b257a30d90"} Dec 11 13:10:52 crc kubenswrapper[4898]: I1211 13:10:52.921527 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33b2dad5-ea59-439e-8659-60fdaba49f82","Type":"ContainerStarted","Data":"0edd9ce626ec4db81a4e8eced1d8b17620937cd7fbbb48683922f23e3f69db57"} Dec 11 13:10:52 crc kubenswrapper[4898]: I1211 13:10:52.921541 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33b2dad5-ea59-439e-8659-60fdaba49f82","Type":"ContainerStarted","Data":"9b908313e242acd60b501a3f474dfd6d7cfc925668b22d0db2613c0daf156030"} Dec 11 13:10:52 crc kubenswrapper[4898]: I1211 13:10:52.946307 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" Dec 11 13:10:52 crc kubenswrapper[4898]: I1211 13:10:52.953734 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" podStartSLOduration=5.19497544 podStartE2EDuration="11.953718185s" podCreationTimestamp="2025-12-11 13:10:41 +0000 UTC" firstStartedPulling="2025-12-11 13:10:42.301516433 +0000 UTC m=+399.873842870" lastFinishedPulling="2025-12-11 13:10:49.060259178 +0000 UTC m=+406.632585615" observedRunningTime="2025-12-11 13:10:52.947522069 +0000 UTC m=+410.519848506" watchObservedRunningTime="2025-12-11 13:10:52.953718185 +0000 UTC m=+410.526044622" Dec 11 13:10:53 crc kubenswrapper[4898]: I1211 13:10:53.931234 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"33b2dad5-ea59-439e-8659-60fdaba49f82","Type":"ContainerStarted","Data":"94249a59bd7d3733c4441b1e0f5bc94f603cd4d67efaf3481fa32aa9bd9e0a59"} Dec 11 13:10:53 crc kubenswrapper[4898]: I1211 13:10:53.931543 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"33b2dad5-ea59-439e-8659-60fdaba49f82","Type":"ContainerStarted","Data":"5c6c31a302892db8b6002de0c9feaf10eb65865cc253fe04e99aab1b42024601"} Dec 11 13:10:53 crc kubenswrapper[4898]: I1211 13:10:53.934758 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d4348035-0731-4884-9bae-eaf5d2a693f9","Type":"ContainerStarted","Data":"8a174809170181ed56d6fee11c581b4ba82a88f2288cc2afeb02d47656360b7a"} Dec 11 13:10:53 crc kubenswrapper[4898]: I1211 13:10:53.974955 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.684812809 podStartE2EDuration="13.974933022s" podCreationTimestamp="2025-12-11 13:10:40 +0000 UTC" firstStartedPulling="2025-12-11 13:10:41.897196169 +0000 UTC m=+399.469522606" lastFinishedPulling="2025-12-11 13:10:52.187316382 +0000 UTC m=+409.759642819" observedRunningTime="2025-12-11 13:10:53.965945032 +0000 UTC m=+411.538271509" watchObservedRunningTime="2025-12-11 13:10:53.974933022 +0000 UTC m=+411.547259479" Dec 11 13:10:54 crc kubenswrapper[4898]: I1211 13:10:54.021531 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.689538369 podStartE2EDuration="9.021511156s" podCreationTimestamp="2025-12-11 13:10:45 +0000 UTC" firstStartedPulling="2025-12-11 13:10:47.873570742 +0000 UTC m=+405.445897179" lastFinishedPulling="2025-12-11 13:10:52.205543529 +0000 UTC m=+409.777869966" observedRunningTime="2025-12-11 13:10:54.018149506 +0000 UTC m=+411.590475943" watchObservedRunningTime="2025-12-11 
13:10:54.021511156 +0000 UTC m=+411.593837583" Dec 11 13:10:54 crc kubenswrapper[4898]: I1211 13:10:54.506444 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:54 crc kubenswrapper[4898]: I1211 13:10:54.506510 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:54 crc kubenswrapper[4898]: I1211 13:10:54.514100 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:54 crc kubenswrapper[4898]: I1211 13:10:54.943916 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:10:55 crc kubenswrapper[4898]: I1211 13:10:55.003161 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7cbpd"] Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.127844 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" podUID="8850b908-0e43-45d6-a8d2-44e1fe06c4e0" containerName="registry" containerID="cri-o://d50693ef0e9ae66365cc644dff94c4ad4ff5f16c0182f45ec44ba9a2fbbeae97" gracePeriod=30 Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.301642 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.540650 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.635877 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-bound-sa-token\") pod \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.635959 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-installation-pull-secrets\") pod \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.635999 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-trusted-ca\") pod \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.636064 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-ca-trust-extracted\") pod \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.636245 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.636307 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-b6j5q\" (UniqueName: \"kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-kube-api-access-b6j5q\") pod \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.636403 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-registry-certificates\") pod \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.636476 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-registry-tls\") pod \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\" (UID: \"8850b908-0e43-45d6-a8d2-44e1fe06c4e0\") " Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.639952 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8850b908-0e43-45d6-a8d2-44e1fe06c4e0" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.639970 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8850b908-0e43-45d6-a8d2-44e1fe06c4e0" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.641523 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-kube-api-access-b6j5q" (OuterVolumeSpecName: "kube-api-access-b6j5q") pod "8850b908-0e43-45d6-a8d2-44e1fe06c4e0" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0"). InnerVolumeSpecName "kube-api-access-b6j5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.641592 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8850b908-0e43-45d6-a8d2-44e1fe06c4e0" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.642117 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8850b908-0e43-45d6-a8d2-44e1fe06c4e0" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.643035 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8850b908-0e43-45d6-a8d2-44e1fe06c4e0" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.651824 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "8850b908-0e43-45d6-a8d2-44e1fe06c4e0" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.652680 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8850b908-0e43-45d6-a8d2-44e1fe06c4e0" (UID: "8850b908-0e43-45d6-a8d2-44e1fe06c4e0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.738414 4898 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.738497 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6j5q\" (UniqueName: \"kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-kube-api-access-b6j5q\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.738517 4898 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.738552 4898 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.738568 4898 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.738580 4898 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.738591 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8850b908-0e43-45d6-a8d2-44e1fe06c4e0-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.953652 4898 generic.go:334] "Generic (PLEG): container finished" podID="8850b908-0e43-45d6-a8d2-44e1fe06c4e0" containerID="d50693ef0e9ae66365cc644dff94c4ad4ff5f16c0182f45ec44ba9a2fbbeae97" exitCode=0 Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.954038 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" event={"ID":"8850b908-0e43-45d6-a8d2-44e1fe06c4e0","Type":"ContainerDied","Data":"d50693ef0e9ae66365cc644dff94c4ad4ff5f16c0182f45ec44ba9a2fbbeae97"} Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.954175 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" event={"ID":"8850b908-0e43-45d6-a8d2-44e1fe06c4e0","Type":"ContainerDied","Data":"21d0a9aebc70d734f273ae6d088a4515f6a4e859751fff42a1a275dc723582f0"} Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.954274 4898 scope.go:117] "RemoveContainer" 
containerID="d50693ef0e9ae66365cc644dff94c4ad4ff5f16c0182f45ec44ba9a2fbbeae97" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.954518 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g9vz7" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.980323 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g9vz7"] Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.989236 4898 scope.go:117] "RemoveContainer" containerID="d50693ef0e9ae66365cc644dff94c4ad4ff5f16c0182f45ec44ba9a2fbbeae97" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.989307 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g9vz7"] Dec 11 13:10:56 crc kubenswrapper[4898]: E1211 13:10:56.989860 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d50693ef0e9ae66365cc644dff94c4ad4ff5f16c0182f45ec44ba9a2fbbeae97\": container with ID starting with d50693ef0e9ae66365cc644dff94c4ad4ff5f16c0182f45ec44ba9a2fbbeae97 not found: ID does not exist" containerID="d50693ef0e9ae66365cc644dff94c4ad4ff5f16c0182f45ec44ba9a2fbbeae97" Dec 11 13:10:56 crc kubenswrapper[4898]: I1211 13:10:56.989923 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d50693ef0e9ae66365cc644dff94c4ad4ff5f16c0182f45ec44ba9a2fbbeae97"} err="failed to get container status \"d50693ef0e9ae66365cc644dff94c4ad4ff5f16c0182f45ec44ba9a2fbbeae97\": rpc error: code = NotFound desc = could not find container \"d50693ef0e9ae66365cc644dff94c4ad4ff5f16c0182f45ec44ba9a2fbbeae97\": container with ID starting with d50693ef0e9ae66365cc644dff94c4ad4ff5f16c0182f45ec44ba9a2fbbeae97 not found: ID does not exist" Dec 11 13:10:58 crc kubenswrapper[4898]: I1211 13:10:58.790854 4898 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="8850b908-0e43-45d6-a8d2-44e1fe06c4e0" path="/var/lib/kubelet/pods/8850b908-0e43-45d6-a8d2-44e1fe06c4e0/volumes" Dec 11 13:11:04 crc kubenswrapper[4898]: I1211 13:11:04.995320 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:11:04 crc kubenswrapper[4898]: I1211 13:11:04.996109 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:11:04 crc kubenswrapper[4898]: I1211 13:11:04.996176 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:11:04 crc kubenswrapper[4898]: I1211 13:11:04.997127 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f637b15c5c6a67943cd9b7a091adbf77f0c3f8526911048606b42e00cbbb2e61"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 13:11:04 crc kubenswrapper[4898]: I1211 13:11:04.997243 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://f637b15c5c6a67943cd9b7a091adbf77f0c3f8526911048606b42e00cbbb2e61" gracePeriod=600 Dec 11 13:11:05 crc kubenswrapper[4898]: I1211 13:11:05.073893 4898 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:11:05 crc kubenswrapper[4898]: I1211 13:11:05.073957 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:11:06 crc kubenswrapper[4898]: I1211 13:11:06.041421 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="f637b15c5c6a67943cd9b7a091adbf77f0c3f8526911048606b42e00cbbb2e61" exitCode=0 Dec 11 13:11:06 crc kubenswrapper[4898]: I1211 13:11:06.041501 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"f637b15c5c6a67943cd9b7a091adbf77f0c3f8526911048606b42e00cbbb2e61"} Dec 11 13:11:06 crc kubenswrapper[4898]: I1211 13:11:06.042286 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"207d0448c3903ada5a8d43c8e4caa5a44f53738d1392a4585159b0ae517f8656"} Dec 11 13:11:06 crc kubenswrapper[4898]: I1211 13:11:06.042339 4898 scope.go:117] "RemoveContainer" containerID="eb7b98c50effebe09db7e51fe139da11af7b116ce6b7d81f0dada92fa29c82a6" Dec 11 13:11:20 crc kubenswrapper[4898]: I1211 13:11:20.054563 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-7cbpd" podUID="dd7edc7b-2d3a-4402-b7de-e70de317e52e" containerName="console" containerID="cri-o://77cb8186e37ceeb95553e18fa80b275131038f66f4ef145697387a2e3a44122b" gracePeriod=15 Dec 11 13:11:21 crc kubenswrapper[4898]: I1211 13:11:21.386973 4898 patch_prober.go:28] interesting pod/console-f9d7485db-7cbpd container/console namespace/openshift-console: Readiness probe status=failure 
output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Dec 11 13:11:21 crc kubenswrapper[4898]: I1211 13:11:21.387325 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-7cbpd" podUID="dd7edc7b-2d3a-4402-b7de-e70de317e52e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.136556 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7cbpd_dd7edc7b-2d3a-4402-b7de-e70de317e52e/console/0.log" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.136961 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.168411 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7cbpd_dd7edc7b-2d3a-4402-b7de-e70de317e52e/console/0.log" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.168476 4898 generic.go:334] "Generic (PLEG): container finished" podID="dd7edc7b-2d3a-4402-b7de-e70de317e52e" containerID="77cb8186e37ceeb95553e18fa80b275131038f66f4ef145697387a2e3a44122b" exitCode=2 Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.168535 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7cbpd" event={"ID":"dd7edc7b-2d3a-4402-b7de-e70de317e52e","Type":"ContainerDied","Data":"77cb8186e37ceeb95553e18fa80b275131038f66f4ef145697387a2e3a44122b"} Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.168581 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7cbpd" 
event={"ID":"dd7edc7b-2d3a-4402-b7de-e70de317e52e","Type":"ContainerDied","Data":"e38e187f9d9ca3467e385951b420634e42884eacc7a246a7d2ffe3a369a2bb59"} Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.168580 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7cbpd" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.168601 4898 scope.go:117] "RemoveContainer" containerID="77cb8186e37ceeb95553e18fa80b275131038f66f4ef145697387a2e3a44122b" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.185831 4898 scope.go:117] "RemoveContainer" containerID="77cb8186e37ceeb95553e18fa80b275131038f66f4ef145697387a2e3a44122b" Dec 11 13:11:23 crc kubenswrapper[4898]: E1211 13:11:23.186505 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77cb8186e37ceeb95553e18fa80b275131038f66f4ef145697387a2e3a44122b\": container with ID starting with 77cb8186e37ceeb95553e18fa80b275131038f66f4ef145697387a2e3a44122b not found: ID does not exist" containerID="77cb8186e37ceeb95553e18fa80b275131038f66f4ef145697387a2e3a44122b" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.186592 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77cb8186e37ceeb95553e18fa80b275131038f66f4ef145697387a2e3a44122b"} err="failed to get container status \"77cb8186e37ceeb95553e18fa80b275131038f66f4ef145697387a2e3a44122b\": rpc error: code = NotFound desc = could not find container \"77cb8186e37ceeb95553e18fa80b275131038f66f4ef145697387a2e3a44122b\": container with ID starting with 77cb8186e37ceeb95553e18fa80b275131038f66f4ef145697387a2e3a44122b not found: ID does not exist" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.289622 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t87p\" (UniqueName: 
\"kubernetes.io/projected/dd7edc7b-2d3a-4402-b7de-e70de317e52e-kube-api-access-7t87p\") pod \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.289733 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-service-ca\") pod \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.289819 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-oauth-serving-cert\") pod \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.289904 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-config\") pod \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.289991 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-trusted-ca-bundle\") pod \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.290230 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-oauth-config\") pod \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " Dec 11 13:11:23 crc 
kubenswrapper[4898]: I1211 13:11:23.290272 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-serving-cert\") pod \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\" (UID: \"dd7edc7b-2d3a-4402-b7de-e70de317e52e\") " Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.290491 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dd7edc7b-2d3a-4402-b7de-e70de317e52e" (UID: "dd7edc7b-2d3a-4402-b7de-e70de317e52e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.290806 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-config" (OuterVolumeSpecName: "console-config") pod "dd7edc7b-2d3a-4402-b7de-e70de317e52e" (UID: "dd7edc7b-2d3a-4402-b7de-e70de317e52e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.291032 4898 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.291041 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dd7edc7b-2d3a-4402-b7de-e70de317e52e" (UID: "dd7edc7b-2d3a-4402-b7de-e70de317e52e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.291061 4898 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.291872 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-service-ca" (OuterVolumeSpecName: "service-ca") pod "dd7edc7b-2d3a-4402-b7de-e70de317e52e" (UID: "dd7edc7b-2d3a-4402-b7de-e70de317e52e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.295702 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dd7edc7b-2d3a-4402-b7de-e70de317e52e" (UID: "dd7edc7b-2d3a-4402-b7de-e70de317e52e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.295940 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dd7edc7b-2d3a-4402-b7de-e70de317e52e" (UID: "dd7edc7b-2d3a-4402-b7de-e70de317e52e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.296649 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd7edc7b-2d3a-4402-b7de-e70de317e52e-kube-api-access-7t87p" (OuterVolumeSpecName: "kube-api-access-7t87p") pod "dd7edc7b-2d3a-4402-b7de-e70de317e52e" (UID: "dd7edc7b-2d3a-4402-b7de-e70de317e52e"). InnerVolumeSpecName "kube-api-access-7t87p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.392701 4898 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.392734 4898 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd7edc7b-2d3a-4402-b7de-e70de317e52e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.392751 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t87p\" (UniqueName: \"kubernetes.io/projected/dd7edc7b-2d3a-4402-b7de-e70de317e52e-kube-api-access-7t87p\") on node \"crc\" DevicePath \"\"" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.392763 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-service-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.392771 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd7edc7b-2d3a-4402-b7de-e70de317e52e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.506759 4898 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-console/console-f9d7485db-7cbpd"] Dec 11 13:11:23 crc kubenswrapper[4898]: I1211 13:11:23.522088 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-7cbpd"] Dec 11 13:11:24 crc kubenswrapper[4898]: I1211 13:11:24.788545 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd7edc7b-2d3a-4402-b7de-e70de317e52e" path="/var/lib/kubelet/pods/dd7edc7b-2d3a-4402-b7de-e70de317e52e/volumes" Dec 11 13:11:25 crc kubenswrapper[4898]: I1211 13:11:25.083109 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:11:25 crc kubenswrapper[4898]: I1211 13:11:25.088746 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" Dec 11 13:11:46 crc kubenswrapper[4898]: I1211 13:11:46.300943 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:11:46 crc kubenswrapper[4898]: I1211 13:11:46.348733 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:11:46 crc kubenswrapper[4898]: I1211 13:11:46.385303 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.633180 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-bf88584db-ctz8c"] Dec 11 13:12:40 crc kubenswrapper[4898]: E1211 13:12:40.634031 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7edc7b-2d3a-4402-b7de-e70de317e52e" containerName="console" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.634048 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7edc7b-2d3a-4402-b7de-e70de317e52e" containerName="console" Dec 11 13:12:40 crc 
kubenswrapper[4898]: E1211 13:12:40.634074 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8850b908-0e43-45d6-a8d2-44e1fe06c4e0" containerName="registry" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.634082 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8850b908-0e43-45d6-a8d2-44e1fe06c4e0" containerName="registry" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.634213 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd7edc7b-2d3a-4402-b7de-e70de317e52e" containerName="console" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.634234 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8850b908-0e43-45d6-a8d2-44e1fe06c4e0" containerName="registry" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.634771 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.640508 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bf88584db-ctz8c"] Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.715755 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-console-config\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.715805 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-trusted-ca-bundle\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.715867 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bdbf089-988a-404d-beb0-212d4aa26387-console-serving-cert\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.715886 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w68w4\" (UniqueName: \"kubernetes.io/projected/8bdbf089-988a-404d-beb0-212d4aa26387-kube-api-access-w68w4\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.715931 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-service-ca\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.715950 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-oauth-serving-cert\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.715992 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8bdbf089-988a-404d-beb0-212d4aa26387-console-oauth-config\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 
11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.817432 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8bdbf089-988a-404d-beb0-212d4aa26387-console-oauth-config\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.817588 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-console-config\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.817654 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-trusted-ca-bundle\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.817790 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bdbf089-988a-404d-beb0-212d4aa26387-console-serving-cert\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.817824 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w68w4\" (UniqueName: \"kubernetes.io/projected/8bdbf089-988a-404d-beb0-212d4aa26387-kube-api-access-w68w4\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc 
kubenswrapper[4898]: I1211 13:12:40.817876 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-service-ca\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.817929 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-oauth-serving-cert\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.820962 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-console-config\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.821145 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-service-ca\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.821234 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-oauth-serving-cert\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.821510 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-trusted-ca-bundle\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.831297 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8bdbf089-988a-404d-beb0-212d4aa26387-console-oauth-config\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.837887 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bdbf089-988a-404d-beb0-212d4aa26387-console-serving-cert\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.854587 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w68w4\" (UniqueName: \"kubernetes.io/projected/8bdbf089-988a-404d-beb0-212d4aa26387-kube-api-access-w68w4\") pod \"console-bf88584db-ctz8c\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:40 crc kubenswrapper[4898]: I1211 13:12:40.966270 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:41 crc kubenswrapper[4898]: I1211 13:12:41.181618 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bf88584db-ctz8c"] Dec 11 13:12:41 crc kubenswrapper[4898]: I1211 13:12:41.729565 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bf88584db-ctz8c" event={"ID":"8bdbf089-988a-404d-beb0-212d4aa26387","Type":"ContainerStarted","Data":"07a11f0707136045f796a5f1b5b4cdd6f868b69626ab7b74214b44dee21de322"} Dec 11 13:12:41 crc kubenswrapper[4898]: I1211 13:12:41.729842 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bf88584db-ctz8c" event={"ID":"8bdbf089-988a-404d-beb0-212d4aa26387","Type":"ContainerStarted","Data":"e6327fcecf51c0161df22c63e0b4450e64bb8d4b3bbeacaf3269bcb43e257548"} Dec 11 13:12:41 crc kubenswrapper[4898]: I1211 13:12:41.755958 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bf88584db-ctz8c" podStartSLOduration=1.755931108 podStartE2EDuration="1.755931108s" podCreationTimestamp="2025-12-11 13:12:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:12:41.750652087 +0000 UTC m=+519.322978524" watchObservedRunningTime="2025-12-11 13:12:41.755931108 +0000 UTC m=+519.328257595" Dec 11 13:12:50 crc kubenswrapper[4898]: I1211 13:12:50.966464 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:50 crc kubenswrapper[4898]: I1211 13:12:50.966996 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:50 crc kubenswrapper[4898]: I1211 13:12:50.970524 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bf88584db-ctz8c" 
Dec 11 13:12:51 crc kubenswrapper[4898]: I1211 13:12:51.813515 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:12:51 crc kubenswrapper[4898]: I1211 13:12:51.865894 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8cd884dbc-x76wj"] Dec 11 13:13:16 crc kubenswrapper[4898]: I1211 13:13:16.926687 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-8cd884dbc-x76wj" podUID="b9e31705-36ee-462a-82ce-7dd3195f945b" containerName="console" containerID="cri-o://856c44d78997d736f1b3be281864f7c59220a269c44057094a81a4af99d4db13" gracePeriod=15 Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.314440 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8cd884dbc-x76wj_b9e31705-36ee-462a-82ce-7dd3195f945b/console/0.log" Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.314833 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.370514 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-service-ca\") pod \"b9e31705-36ee-462a-82ce-7dd3195f945b\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.370608 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9gnl\" (UniqueName: \"kubernetes.io/projected/b9e31705-36ee-462a-82ce-7dd3195f945b-kube-api-access-p9gnl\") pod \"b9e31705-36ee-462a-82ce-7dd3195f945b\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.370629 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-trusted-ca-bundle\") pod \"b9e31705-36ee-462a-82ce-7dd3195f945b\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.370697 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-oauth-serving-cert\") pod \"b9e31705-36ee-462a-82ce-7dd3195f945b\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.371489 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b9e31705-36ee-462a-82ce-7dd3195f945b" (UID: "b9e31705-36ee-462a-82ce-7dd3195f945b"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.371527 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b9e31705-36ee-462a-82ce-7dd3195f945b" (UID: "b9e31705-36ee-462a-82ce-7dd3195f945b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.371618 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9e31705-36ee-462a-82ce-7dd3195f945b-console-oauth-config\") pod \"b9e31705-36ee-462a-82ce-7dd3195f945b\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.371611 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-service-ca" (OuterVolumeSpecName: "service-ca") pod "b9e31705-36ee-462a-82ce-7dd3195f945b" (UID: "b9e31705-36ee-462a-82ce-7dd3195f945b"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.371648 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9e31705-36ee-462a-82ce-7dd3195f945b-console-serving-cert\") pod \"b9e31705-36ee-462a-82ce-7dd3195f945b\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.371698 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-console-config\") pod \"b9e31705-36ee-462a-82ce-7dd3195f945b\" (UID: \"b9e31705-36ee-462a-82ce-7dd3195f945b\") " Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.371965 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-service-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.371979 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.371988 4898 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.372212 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-console-config" (OuterVolumeSpecName: "console-config") pod "b9e31705-36ee-462a-82ce-7dd3195f945b" (UID: "b9e31705-36ee-462a-82ce-7dd3195f945b"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.375244 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9e31705-36ee-462a-82ce-7dd3195f945b-kube-api-access-p9gnl" (OuterVolumeSpecName: "kube-api-access-p9gnl") pod "b9e31705-36ee-462a-82ce-7dd3195f945b" (UID: "b9e31705-36ee-462a-82ce-7dd3195f945b"). InnerVolumeSpecName "kube-api-access-p9gnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.375813 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9e31705-36ee-462a-82ce-7dd3195f945b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b9e31705-36ee-462a-82ce-7dd3195f945b" (UID: "b9e31705-36ee-462a-82ce-7dd3195f945b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.376231 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9e31705-36ee-462a-82ce-7dd3195f945b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b9e31705-36ee-462a-82ce-7dd3195f945b" (UID: "b9e31705-36ee-462a-82ce-7dd3195f945b"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.473315 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9gnl\" (UniqueName: \"kubernetes.io/projected/b9e31705-36ee-462a-82ce-7dd3195f945b-kube-api-access-p9gnl\") on node \"crc\" DevicePath \"\"" Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.473357 4898 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9e31705-36ee-462a-82ce-7dd3195f945b-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.473369 4898 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9e31705-36ee-462a-82ce-7dd3195f945b-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:13:17 crc kubenswrapper[4898]: I1211 13:13:17.473382 4898 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9e31705-36ee-462a-82ce-7dd3195f945b-console-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:13:18 crc kubenswrapper[4898]: I1211 13:13:18.045609 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8cd884dbc-x76wj_b9e31705-36ee-462a-82ce-7dd3195f945b/console/0.log" Dec 11 13:13:18 crc kubenswrapper[4898]: I1211 13:13:18.045667 4898 generic.go:334] "Generic (PLEG): container finished" podID="b9e31705-36ee-462a-82ce-7dd3195f945b" containerID="856c44d78997d736f1b3be281864f7c59220a269c44057094a81a4af99d4db13" exitCode=2 Dec 11 13:13:18 crc kubenswrapper[4898]: I1211 13:13:18.045702 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8cd884dbc-x76wj" event={"ID":"b9e31705-36ee-462a-82ce-7dd3195f945b","Type":"ContainerDied","Data":"856c44d78997d736f1b3be281864f7c59220a269c44057094a81a4af99d4db13"} Dec 11 13:13:18 crc kubenswrapper[4898]: I1211 
13:13:18.045730 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8cd884dbc-x76wj" event={"ID":"b9e31705-36ee-462a-82ce-7dd3195f945b","Type":"ContainerDied","Data":"845c0e1769a724643c67bad26c0eb358971cf72fea6e8044abae13ab382bae6f"} Dec 11 13:13:18 crc kubenswrapper[4898]: I1211 13:13:18.045752 4898 scope.go:117] "RemoveContainer" containerID="856c44d78997d736f1b3be281864f7c59220a269c44057094a81a4af99d4db13" Dec 11 13:13:18 crc kubenswrapper[4898]: I1211 13:13:18.045878 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8cd884dbc-x76wj" Dec 11 13:13:18 crc kubenswrapper[4898]: I1211 13:13:18.067677 4898 scope.go:117] "RemoveContainer" containerID="856c44d78997d736f1b3be281864f7c59220a269c44057094a81a4af99d4db13" Dec 11 13:13:18 crc kubenswrapper[4898]: E1211 13:13:18.068109 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"856c44d78997d736f1b3be281864f7c59220a269c44057094a81a4af99d4db13\": container with ID starting with 856c44d78997d736f1b3be281864f7c59220a269c44057094a81a4af99d4db13 not found: ID does not exist" containerID="856c44d78997d736f1b3be281864f7c59220a269c44057094a81a4af99d4db13" Dec 11 13:13:18 crc kubenswrapper[4898]: I1211 13:13:18.068171 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"856c44d78997d736f1b3be281864f7c59220a269c44057094a81a4af99d4db13"} err="failed to get container status \"856c44d78997d736f1b3be281864f7c59220a269c44057094a81a4af99d4db13\": rpc error: code = NotFound desc = could not find container \"856c44d78997d736f1b3be281864f7c59220a269c44057094a81a4af99d4db13\": container with ID starting with 856c44d78997d736f1b3be281864f7c59220a269c44057094a81a4af99d4db13 not found: ID does not exist" Dec 11 13:13:18 crc kubenswrapper[4898]: I1211 13:13:18.100960 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-8cd884dbc-x76wj"] Dec 11 13:13:18 crc kubenswrapper[4898]: I1211 13:13:18.105806 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8cd884dbc-x76wj"] Dec 11 13:13:18 crc kubenswrapper[4898]: I1211 13:13:18.785379 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9e31705-36ee-462a-82ce-7dd3195f945b" path="/var/lib/kubelet/pods/b9e31705-36ee-462a-82ce-7dd3195f945b/volumes" Dec 11 13:13:34 crc kubenswrapper[4898]: I1211 13:13:34.996198 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:13:34 crc kubenswrapper[4898]: I1211 13:13:34.997008 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:14:04 crc kubenswrapper[4898]: I1211 13:14:04.995858 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:14:04 crc kubenswrapper[4898]: I1211 13:14:04.996601 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:14:34 crc 
kubenswrapper[4898]: I1211 13:14:34.995728 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:14:34 crc kubenswrapper[4898]: I1211 13:14:34.996426 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:14:34 crc kubenswrapper[4898]: I1211 13:14:34.996521 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:14:34 crc kubenswrapper[4898]: I1211 13:14:34.997408 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"207d0448c3903ada5a8d43c8e4caa5a44f53738d1392a4585159b0ae517f8656"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 13:14:34 crc kubenswrapper[4898]: I1211 13:14:34.997543 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://207d0448c3903ada5a8d43c8e4caa5a44f53738d1392a4585159b0ae517f8656" gracePeriod=600 Dec 11 13:14:35 crc kubenswrapper[4898]: I1211 13:14:35.633638 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" 
containerID="207d0448c3903ada5a8d43c8e4caa5a44f53738d1392a4585159b0ae517f8656" exitCode=0 Dec 11 13:14:35 crc kubenswrapper[4898]: I1211 13:14:35.633692 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"207d0448c3903ada5a8d43c8e4caa5a44f53738d1392a4585159b0ae517f8656"} Dec 11 13:14:35 crc kubenswrapper[4898]: I1211 13:14:35.634229 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"0d92a34a49f382a6447faa4993a93c91c6e3595696f153fe710d6ffe0ba35fec"} Dec 11 13:14:35 crc kubenswrapper[4898]: I1211 13:14:35.634276 4898 scope.go:117] "RemoveContainer" containerID="f637b15c5c6a67943cd9b7a091adbf77f0c3f8526911048606b42e00cbbb2e61" Dec 11 13:14:40 crc kubenswrapper[4898]: I1211 13:14:40.753228 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4"] Dec 11 13:14:40 crc kubenswrapper[4898]: E1211 13:14:40.754030 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9e31705-36ee-462a-82ce-7dd3195f945b" containerName="console" Dec 11 13:14:40 crc kubenswrapper[4898]: I1211 13:14:40.754044 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e31705-36ee-462a-82ce-7dd3195f945b" containerName="console" Dec 11 13:14:40 crc kubenswrapper[4898]: I1211 13:14:40.754188 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9e31705-36ee-462a-82ce-7dd3195f945b" containerName="console" Dec 11 13:14:40 crc kubenswrapper[4898]: I1211 13:14:40.755225 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" Dec 11 13:14:40 crc kubenswrapper[4898]: I1211 13:14:40.757841 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 11 13:14:40 crc kubenswrapper[4898]: I1211 13:14:40.770925 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4"] Dec 11 13:14:40 crc kubenswrapper[4898]: I1211 13:14:40.932214 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26dfece8-854b-4146-bcee-c25943aab4b2-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4\" (UID: \"26dfece8-854b-4146-bcee-c25943aab4b2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" Dec 11 13:14:40 crc kubenswrapper[4898]: I1211 13:14:40.932511 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4xx7\" (UniqueName: \"kubernetes.io/projected/26dfece8-854b-4146-bcee-c25943aab4b2-kube-api-access-m4xx7\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4\" (UID: \"26dfece8-854b-4146-bcee-c25943aab4b2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" Dec 11 13:14:40 crc kubenswrapper[4898]: I1211 13:14:40.932641 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26dfece8-854b-4146-bcee-c25943aab4b2-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4\" (UID: \"26dfece8-854b-4146-bcee-c25943aab4b2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" Dec 11 13:14:41 crc kubenswrapper[4898]: 
I1211 13:14:41.033802 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26dfece8-854b-4146-bcee-c25943aab4b2-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4\" (UID: \"26dfece8-854b-4146-bcee-c25943aab4b2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" Dec 11 13:14:41 crc kubenswrapper[4898]: I1211 13:14:41.033859 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26dfece8-854b-4146-bcee-c25943aab4b2-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4\" (UID: \"26dfece8-854b-4146-bcee-c25943aab4b2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" Dec 11 13:14:41 crc kubenswrapper[4898]: I1211 13:14:41.033908 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4xx7\" (UniqueName: \"kubernetes.io/projected/26dfece8-854b-4146-bcee-c25943aab4b2-kube-api-access-m4xx7\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4\" (UID: \"26dfece8-854b-4146-bcee-c25943aab4b2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" Dec 11 13:14:41 crc kubenswrapper[4898]: I1211 13:14:41.034539 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26dfece8-854b-4146-bcee-c25943aab4b2-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4\" (UID: \"26dfece8-854b-4146-bcee-c25943aab4b2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" Dec 11 13:14:41 crc kubenswrapper[4898]: I1211 13:14:41.035028 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/26dfece8-854b-4146-bcee-c25943aab4b2-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4\" (UID: \"26dfece8-854b-4146-bcee-c25943aab4b2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" Dec 11 13:14:41 crc kubenswrapper[4898]: I1211 13:14:41.056806 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4xx7\" (UniqueName: \"kubernetes.io/projected/26dfece8-854b-4146-bcee-c25943aab4b2-kube-api-access-m4xx7\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4\" (UID: \"26dfece8-854b-4146-bcee-c25943aab4b2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" Dec 11 13:14:41 crc kubenswrapper[4898]: I1211 13:14:41.073094 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" Dec 11 13:14:41 crc kubenswrapper[4898]: I1211 13:14:41.303179 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4"] Dec 11 13:14:41 crc kubenswrapper[4898]: I1211 13:14:41.680213 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" event={"ID":"26dfece8-854b-4146-bcee-c25943aab4b2","Type":"ContainerStarted","Data":"882d048e7b77ac25feff63834b3159b9e12193d55ee881cd07bd71544d274095"} Dec 11 13:14:41 crc kubenswrapper[4898]: I1211 13:14:41.680590 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" event={"ID":"26dfece8-854b-4146-bcee-c25943aab4b2","Type":"ContainerStarted","Data":"f8c327217aab38d604a130c4c994a2fff51bd616d861541dc27e7bc6143b1d92"} Dec 11 13:14:42 crc kubenswrapper[4898]: I1211 13:14:42.689954 4898 
generic.go:334] "Generic (PLEG): container finished" podID="26dfece8-854b-4146-bcee-c25943aab4b2" containerID="882d048e7b77ac25feff63834b3159b9e12193d55ee881cd07bd71544d274095" exitCode=0 Dec 11 13:14:42 crc kubenswrapper[4898]: I1211 13:14:42.690001 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" event={"ID":"26dfece8-854b-4146-bcee-c25943aab4b2","Type":"ContainerDied","Data":"882d048e7b77ac25feff63834b3159b9e12193d55ee881cd07bd71544d274095"} Dec 11 13:14:42 crc kubenswrapper[4898]: I1211 13:14:42.691577 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 13:14:44 crc kubenswrapper[4898]: I1211 13:14:44.707237 4898 generic.go:334] "Generic (PLEG): container finished" podID="26dfece8-854b-4146-bcee-c25943aab4b2" containerID="3b6ad80fddc39e34c9d7559e125e5adec2d23e51ae725b70e27e7b799636b8e3" exitCode=0 Dec 11 13:14:44 crc kubenswrapper[4898]: I1211 13:14:44.707285 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" event={"ID":"26dfece8-854b-4146-bcee-c25943aab4b2","Type":"ContainerDied","Data":"3b6ad80fddc39e34c9d7559e125e5adec2d23e51ae725b70e27e7b799636b8e3"} Dec 11 13:14:45 crc kubenswrapper[4898]: I1211 13:14:45.722315 4898 generic.go:334] "Generic (PLEG): container finished" podID="26dfece8-854b-4146-bcee-c25943aab4b2" containerID="a259b5232e89227e80d32934008102f342e16f8bb92eadf8f65a6083d2363d79" exitCode=0 Dec 11 13:14:45 crc kubenswrapper[4898]: I1211 13:14:45.722411 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" event={"ID":"26dfece8-854b-4146-bcee-c25943aab4b2","Type":"ContainerDied","Data":"a259b5232e89227e80d32934008102f342e16f8bb92eadf8f65a6083d2363d79"} Dec 11 13:14:47 crc kubenswrapper[4898]: I1211 
13:14:47.014870 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" Dec 11 13:14:47 crc kubenswrapper[4898]: I1211 13:14:47.131333 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26dfece8-854b-4146-bcee-c25943aab4b2-bundle\") pod \"26dfece8-854b-4146-bcee-c25943aab4b2\" (UID: \"26dfece8-854b-4146-bcee-c25943aab4b2\") " Dec 11 13:14:47 crc kubenswrapper[4898]: I1211 13:14:47.131510 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26dfece8-854b-4146-bcee-c25943aab4b2-util\") pod \"26dfece8-854b-4146-bcee-c25943aab4b2\" (UID: \"26dfece8-854b-4146-bcee-c25943aab4b2\") " Dec 11 13:14:47 crc kubenswrapper[4898]: I1211 13:14:47.131534 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4xx7\" (UniqueName: \"kubernetes.io/projected/26dfece8-854b-4146-bcee-c25943aab4b2-kube-api-access-m4xx7\") pod \"26dfece8-854b-4146-bcee-c25943aab4b2\" (UID: \"26dfece8-854b-4146-bcee-c25943aab4b2\") " Dec 11 13:14:47 crc kubenswrapper[4898]: I1211 13:14:47.134361 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26dfece8-854b-4146-bcee-c25943aab4b2-bundle" (OuterVolumeSpecName: "bundle") pod "26dfece8-854b-4146-bcee-c25943aab4b2" (UID: "26dfece8-854b-4146-bcee-c25943aab4b2"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:14:47 crc kubenswrapper[4898]: I1211 13:14:47.137684 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26dfece8-854b-4146-bcee-c25943aab4b2-kube-api-access-m4xx7" (OuterVolumeSpecName: "kube-api-access-m4xx7") pod "26dfece8-854b-4146-bcee-c25943aab4b2" (UID: "26dfece8-854b-4146-bcee-c25943aab4b2"). InnerVolumeSpecName "kube-api-access-m4xx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:14:47 crc kubenswrapper[4898]: I1211 13:14:47.169776 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26dfece8-854b-4146-bcee-c25943aab4b2-util" (OuterVolumeSpecName: "util") pod "26dfece8-854b-4146-bcee-c25943aab4b2" (UID: "26dfece8-854b-4146-bcee-c25943aab4b2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:14:47 crc kubenswrapper[4898]: I1211 13:14:47.233136 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26dfece8-854b-4146-bcee-c25943aab4b2-util\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:47 crc kubenswrapper[4898]: I1211 13:14:47.233175 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4xx7\" (UniqueName: \"kubernetes.io/projected/26dfece8-854b-4146-bcee-c25943aab4b2-kube-api-access-m4xx7\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:47 crc kubenswrapper[4898]: I1211 13:14:47.233187 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26dfece8-854b-4146-bcee-c25943aab4b2-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:47 crc kubenswrapper[4898]: I1211 13:14:47.740247 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" 
event={"ID":"26dfece8-854b-4146-bcee-c25943aab4b2","Type":"ContainerDied","Data":"f8c327217aab38d604a130c4c994a2fff51bd616d861541dc27e7bc6143b1d92"} Dec 11 13:14:47 crc kubenswrapper[4898]: I1211 13:14:47.740299 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8c327217aab38d604a130c4c994a2fff51bd616d861541dc27e7bc6143b1d92" Dec 11 13:14:47 crc kubenswrapper[4898]: I1211 13:14:47.740956 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4" Dec 11 13:14:51 crc kubenswrapper[4898]: I1211 13:14:51.832288 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qndxl"] Dec 11 13:14:51 crc kubenswrapper[4898]: I1211 13:14:51.834082 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovn-controller" containerID="cri-o://b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127" gracePeriod=30 Dec 11 13:14:51 crc kubenswrapper[4898]: I1211 13:14:51.834184 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="nbdb" containerID="cri-o://6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb" gracePeriod=30 Dec 11 13:14:51 crc kubenswrapper[4898]: I1211 13:14:51.834267 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="kube-rbac-proxy-node" containerID="cri-o://d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569" gracePeriod=30 Dec 11 13:14:51 crc kubenswrapper[4898]: I1211 13:14:51.834296 4898 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="northd" containerID="cri-o://d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982" gracePeriod=30 Dec 11 13:14:51 crc kubenswrapper[4898]: I1211 13:14:51.834369 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovn-acl-logging" containerID="cri-o://a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90" gracePeriod=30 Dec 11 13:14:51 crc kubenswrapper[4898]: I1211 13:14:51.834203 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0" gracePeriod=30 Dec 11 13:14:51 crc kubenswrapper[4898]: I1211 13:14:51.834849 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="sbdb" containerID="cri-o://e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0" gracePeriod=30 Dec 11 13:14:51 crc kubenswrapper[4898]: I1211 13:14:51.881512 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" containerID="cri-o://1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d" gracePeriod=30 Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.772398 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlqfj_4e8ed6cb-b822-4b64-9e00-e755c5aea812/kube-multus/2.log" Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.773152 4898 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlqfj_4e8ed6cb-b822-4b64-9e00-e755c5aea812/kube-multus/1.log" Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.773199 4898 generic.go:334] "Generic (PLEG): container finished" podID="4e8ed6cb-b822-4b64-9e00-e755c5aea812" containerID="114b3259a9fd034573fe9dc3c103980a05be8bdb8205c084f35e27725255ec28" exitCode=2 Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.773254 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dlqfj" event={"ID":"4e8ed6cb-b822-4b64-9e00-e755c5aea812","Type":"ContainerDied","Data":"114b3259a9fd034573fe9dc3c103980a05be8bdb8205c084f35e27725255ec28"} Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.773288 4898 scope.go:117] "RemoveContainer" containerID="dbc9b844a873836a500af9d781e17c635495762b0e50af71dadf141ff00723a3" Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.773787 4898 scope.go:117] "RemoveContainer" containerID="114b3259a9fd034573fe9dc3c103980a05be8bdb8205c084f35e27725255ec28" Dec 11 13:14:52 crc kubenswrapper[4898]: E1211 13:14:52.774137 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-dlqfj_openshift-multus(4e8ed6cb-b822-4b64-9e00-e755c5aea812)\"" pod="openshift-multus/multus-dlqfj" podUID="4e8ed6cb-b822-4b64-9e00-e755c5aea812" Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.777343 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovnkube-controller/3.log" Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.780003 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovn-acl-logging/0.log" Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.780612 4898 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovn-controller/0.log" Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.780991 4898 generic.go:334] "Generic (PLEG): container finished" podID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerID="1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d" exitCode=0 Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.781010 4898 generic.go:334] "Generic (PLEG): container finished" podID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerID="e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0" exitCode=0 Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.781018 4898 generic.go:334] "Generic (PLEG): container finished" podID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerID="6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb" exitCode=0 Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.781024 4898 generic.go:334] "Generic (PLEG): container finished" podID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerID="d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982" exitCode=0 Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.781030 4898 generic.go:334] "Generic (PLEG): container finished" podID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerID="a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90" exitCode=143 Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.781037 4898 generic.go:334] "Generic (PLEG): container finished" podID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerID="b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127" exitCode=143 Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.782725 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" 
event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerDied","Data":"1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d"} Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.782752 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerDied","Data":"e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0"} Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.782763 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerDied","Data":"6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb"} Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.782771 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerDied","Data":"d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982"} Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.782780 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerDied","Data":"a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90"} Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.782788 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerDied","Data":"b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127"} Dec 11 13:14:52 crc kubenswrapper[4898]: I1211 13:14:52.796232 4898 scope.go:117] "RemoveContainer" containerID="b19bc85d9d1f776f0f2a0bfedbfa6f30383df3673e54acadcc74a46e4d24bcde" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.045397 4898 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb is running failed: container process not found" containerID="6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.045403 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0 is running failed: container process not found" containerID="e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.046354 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb is running failed: container process not found" containerID="6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.046406 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0 is running failed: container process not found" containerID="e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.046722 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb is running failed: container process not found" containerID="6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.046745 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0 is running failed: container process not found" containerID="e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.046757 4898 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="nbdb" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.046776 4898 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="sbdb" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.132092 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovn-acl-logging/0.log" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.132441 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovn-controller/0.log" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.132801 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296304 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lwjvl"] Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.296518 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26dfece8-854b-4146-bcee-c25943aab4b2" containerName="util" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296529 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="26dfece8-854b-4146-bcee-c25943aab4b2" containerName="util" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.296540 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="kubecfg-setup" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296545 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="kubecfg-setup" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.296553 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296559 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.296566 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26dfece8-854b-4146-bcee-c25943aab4b2" containerName="pull" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296572 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="26dfece8-854b-4146-bcee-c25943aab4b2" containerName="pull" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.296588 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" Dec 
11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296593 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.296599 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="kube-rbac-proxy-node" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296605 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="kube-rbac-proxy-node" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.296614 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="sbdb" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296620 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="sbdb" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.296626 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26dfece8-854b-4146-bcee-c25943aab4b2" containerName="extract" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296633 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="26dfece8-854b-4146-bcee-c25943aab4b2" containerName="extract" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.296641 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovn-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296647 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovn-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.296658 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: 
I1211 13:14:53.296663 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.296669 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="northd" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296674 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="northd" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.296682 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovn-acl-logging" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296688 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovn-acl-logging" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.296696 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="kube-rbac-proxy-ovn-metrics" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296701 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="kube-rbac-proxy-ovn-metrics" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.296708 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="nbdb" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296714 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="nbdb" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296819 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296830 4898 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296836 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovn-acl-logging" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296845 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="northd" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296856 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovn-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296866 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296876 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="kube-rbac-proxy-node" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296883 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="kube-rbac-proxy-ovn-metrics" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296890 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="sbdb" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296899 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="26dfece8-854b-4146-bcee-c25943aab4b2" containerName="extract" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.296906 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="nbdb" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 
13:14:53.296996 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.297003 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.297108 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.297121 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: E1211 13:14:53.297207 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.297215 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerName="ovnkube-controller" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.298697 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.315859 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovn-node-metrics-cert\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.315900 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-env-overrides\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.315921 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovnkube-config\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.315944 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-openvswitch\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.315972 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-run-netns\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316010 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-var-lib-openvswitch\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316027 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-node-log\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316045 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovnkube-script-lib\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316059 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-etc-openvswitch\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316079 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316099 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-systemd\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: 
\"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316123 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-systemd-units\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316141 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-cni-netd\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316155 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-slash\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316190 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-log-socket\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316208 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-729s2\" (UniqueName: \"kubernetes.io/projected/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-kube-api-access-729s2\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316224 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-cni-bin\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316245 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-kubelet\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316297 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-run-ovn-kubernetes\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316325 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-ovn\") pod \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\" (UID: \"1efa7034-8a95-4e6e-bd84-0189dc5acaa3\") " Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316404 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316439 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316510 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-log-socket" (OuterVolumeSpecName: "log-socket") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316678 4898 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316697 4898 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316709 4898 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-log-socket\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316739 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-systemd-units" (OuterVolumeSpecName: "systemd-units") pod 
"1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316767 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316794 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-slash" (OuterVolumeSpecName: "host-slash") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316825 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-node-log" (OuterVolumeSpecName: "node-log") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316810 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316846 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316859 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316886 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316888 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316919 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316946 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316969 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.316996 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.317223 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.322897 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.337716 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-kube-api-access-729s2" (OuterVolumeSpecName: "kube-api-access-729s2") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "kube-api-access-729s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.356681 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "1efa7034-8a95-4e6e-bd84-0189dc5acaa3" (UID: "1efa7034-8a95-4e6e-bd84-0189dc5acaa3"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418103 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-etc-openvswitch\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418346 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f21a88db-d1eb-4a8d-a82c-c9652c86a935-env-overrides\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418362 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f21a88db-d1eb-4a8d-a82c-c9652c86a935-ovnkube-config\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418384 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-run-systemd\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418405 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-kubelet\") pod \"ovnkube-node-lwjvl\" (UID: 
\"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418419 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-run-openvswitch\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418443 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-cni-netd\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418477 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-run-ovn\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418492 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f21a88db-d1eb-4a8d-a82c-c9652c86a935-ovn-node-metrics-cert\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418518 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-slash\") pod \"ovnkube-node-lwjvl\" 
(UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418533 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-systemd-units\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418559 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-run-ovn-kubernetes\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418577 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418592 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j75x5\" (UniqueName: \"kubernetes.io/projected/f21a88db-d1eb-4a8d-a82c-c9652c86a935-kube-api-access-j75x5\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418611 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/f21a88db-d1eb-4a8d-a82c-c9652c86a935-ovnkube-script-lib\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418626 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-run-netns\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418642 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-var-lib-openvswitch\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418668 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-node-log\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418688 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-log-socket\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418703 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-cni-bin\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418737 4898 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418747 4898 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418755 4898 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-node-log\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418765 4898 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418779 4898 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418790 4898 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418798 4898 reconciler_common.go:293] "Volume detached 
for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418807 4898 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418816 4898 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418824 4898 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-slash\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418832 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-729s2\" (UniqueName: \"kubernetes.io/projected/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-kube-api-access-729s2\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418840 4898 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418848 4898 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418858 4898 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418866 4898 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418874 4898 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.418882 4898 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1efa7034-8a95-4e6e-bd84-0189dc5acaa3-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.519581 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-cni-bin\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.519848 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-etc-openvswitch\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.519981 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f21a88db-d1eb-4a8d-a82c-c9652c86a935-env-overrides\") pod \"ovnkube-node-lwjvl\" (UID: 
\"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.520079 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f21a88db-d1eb-4a8d-a82c-c9652c86a935-ovnkube-config\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.520168 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-run-systemd\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.520241 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-kubelet\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.520299 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-run-openvswitch\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.520374 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-cni-netd\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.520441 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-run-ovn\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.520553 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f21a88db-d1eb-4a8d-a82c-c9652c86a935-ovn-node-metrics-cert\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.520626 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-slash\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.520693 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-systemd-units\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.520754 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f21a88db-d1eb-4a8d-a82c-c9652c86a935-env-overrides\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 
13:14:53.520809 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-etc-openvswitch\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.520846 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-run-openvswitch\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.520893 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-run-ovn-kubernetes\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.520188 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-cni-bin\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.521110 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-run-systemd\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.521021 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-slash\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.521045 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-systemd-units\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.521049 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-kubelet\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.521069 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-cni-netd\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.521074 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-run-ovn\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.521020 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.521242 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.521317 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f21a88db-d1eb-4a8d-a82c-c9652c86a935-ovnkube-config\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.521502 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.522025 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j75x5\" (UniqueName: \"kubernetes.io/projected/f21a88db-d1eb-4a8d-a82c-c9652c86a935-kube-api-access-j75x5\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.522408 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/f21a88db-d1eb-4a8d-a82c-c9652c86a935-ovnkube-script-lib\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.522603 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-run-netns\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.522769 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-var-lib-openvswitch\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.522969 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-node-log\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.523119 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-log-socket\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.523199 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-log-socket\") pod 
\"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.523015 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-node-log\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.523058 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f21a88db-d1eb-4a8d-a82c-c9652c86a935-ovnkube-script-lib\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.522907 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-var-lib-openvswitch\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.522729 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f21a88db-d1eb-4a8d-a82c-c9652c86a935-host-run-netns\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.525192 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f21a88db-d1eb-4a8d-a82c-c9652c86a935-ovn-node-metrics-cert\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.538922 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j75x5\" (UniqueName: \"kubernetes.io/projected/f21a88db-d1eb-4a8d-a82c-c9652c86a935-kube-api-access-j75x5\") pod \"ovnkube-node-lwjvl\" (UID: \"f21a88db-d1eb-4a8d-a82c-c9652c86a935\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.610903 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.797253 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovn-acl-logging/0.log" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.797835 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qndxl_1efa7034-8a95-4e6e-bd84-0189dc5acaa3/ovn-controller/0.log" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.802073 4898 generic.go:334] "Generic (PLEG): container finished" podID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerID="dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0" exitCode=0 Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.802321 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.802316 4898 generic.go:334] "Generic (PLEG): container finished" podID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" containerID="d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569" exitCode=0 Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.802220 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerDied","Data":"dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0"} Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.802716 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerDied","Data":"d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569"} Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.802792 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qndxl" event={"ID":"1efa7034-8a95-4e6e-bd84-0189dc5acaa3","Type":"ContainerDied","Data":"f1c2b6189ae69df6ce73a2937e9e0c5b5e610f3e04da5f5d832b13e9de9295d6"} Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.802872 4898 scope.go:117] "RemoveContainer" containerID="1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.805424 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlqfj_4e8ed6cb-b822-4b64-9e00-e755c5aea812/kube-multus/2.log" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.817969 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" event={"ID":"f21a88db-d1eb-4a8d-a82c-c9652c86a935","Type":"ContainerStarted","Data":"4d8109402a4e165200b86cab9e71355af394e937e96eb775696b73471d3a5f39"} Dec 
11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.872640 4898 scope.go:117] "RemoveContainer" containerID="e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.898625 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qndxl"] Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.905207 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qndxl"] Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.916138 4898 scope.go:117] "RemoveContainer" containerID="6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.934958 4898 scope.go:117] "RemoveContainer" containerID="d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.958480 4898 scope.go:117] "RemoveContainer" containerID="dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0" Dec 11 13:14:53 crc kubenswrapper[4898]: I1211 13:14:53.973008 4898 scope.go:117] "RemoveContainer" containerID="d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.021833 4898 scope.go:117] "RemoveContainer" containerID="a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.057698 4898 scope.go:117] "RemoveContainer" containerID="b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.130293 4898 scope.go:117] "RemoveContainer" containerID="8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.207304 4898 scope.go:117] "RemoveContainer" containerID="1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d" Dec 11 13:14:54 crc kubenswrapper[4898]: E1211 
13:14:54.211320 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d\": container with ID starting with 1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d not found: ID does not exist" containerID="1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.211355 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d"} err="failed to get container status \"1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d\": rpc error: code = NotFound desc = could not find container \"1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d\": container with ID starting with 1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.211380 4898 scope.go:117] "RemoveContainer" containerID="e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0" Dec 11 13:14:54 crc kubenswrapper[4898]: E1211 13:14:54.214022 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\": container with ID starting with e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0 not found: ID does not exist" containerID="e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.214060 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0"} err="failed to get container status \"e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\": rpc 
error: code = NotFound desc = could not find container \"e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\": container with ID starting with e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0 not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.214078 4898 scope.go:117] "RemoveContainer" containerID="6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb" Dec 11 13:14:54 crc kubenswrapper[4898]: E1211 13:14:54.214615 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\": container with ID starting with 6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb not found: ID does not exist" containerID="6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.214636 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb"} err="failed to get container status \"6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\": rpc error: code = NotFound desc = could not find container \"6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\": container with ID starting with 6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.214650 4898 scope.go:117] "RemoveContainer" containerID="d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982" Dec 11 13:14:54 crc kubenswrapper[4898]: E1211 13:14:54.215481 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\": container with ID starting with 
d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982 not found: ID does not exist" containerID="d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.215504 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982"} err="failed to get container status \"d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\": rpc error: code = NotFound desc = could not find container \"d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\": container with ID starting with d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982 not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.215523 4898 scope.go:117] "RemoveContainer" containerID="dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0" Dec 11 13:14:54 crc kubenswrapper[4898]: E1211 13:14:54.215960 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\": container with ID starting with dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0 not found: ID does not exist" containerID="dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.215981 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0"} err="failed to get container status \"dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\": rpc error: code = NotFound desc = could not find container \"dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\": container with ID starting with dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0 not found: ID does not 
exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.215995 4898 scope.go:117] "RemoveContainer" containerID="d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569" Dec 11 13:14:54 crc kubenswrapper[4898]: E1211 13:14:54.216600 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\": container with ID starting with d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569 not found: ID does not exist" containerID="d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.216625 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569"} err="failed to get container status \"d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\": rpc error: code = NotFound desc = could not find container \"d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\": container with ID starting with d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569 not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.216684 4898 scope.go:117] "RemoveContainer" containerID="a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90" Dec 11 13:14:54 crc kubenswrapper[4898]: E1211 13:14:54.217260 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\": container with ID starting with a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90 not found: ID does not exist" containerID="a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.217324 4898 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90"} err="failed to get container status \"a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\": rpc error: code = NotFound desc = could not find container \"a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\": container with ID starting with a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90 not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.217351 4898 scope.go:117] "RemoveContainer" containerID="b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127" Dec 11 13:14:54 crc kubenswrapper[4898]: E1211 13:14:54.222886 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\": container with ID starting with b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127 not found: ID does not exist" containerID="b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.222923 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127"} err="failed to get container status \"b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\": rpc error: code = NotFound desc = could not find container \"b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\": container with ID starting with b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127 not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.222948 4898 scope.go:117] "RemoveContainer" containerID="8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61" Dec 11 13:14:54 crc kubenswrapper[4898]: E1211 13:14:54.225131 4898 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\": container with ID starting with 8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61 not found: ID does not exist" containerID="8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.225157 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61"} err="failed to get container status \"8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\": rpc error: code = NotFound desc = could not find container \"8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\": container with ID starting with 8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61 not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.225175 4898 scope.go:117] "RemoveContainer" containerID="1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.226732 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d"} err="failed to get container status \"1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d\": rpc error: code = NotFound desc = could not find container \"1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d\": container with ID starting with 1562e3d204cfba3207d1a92c464e141095798dfd147032dc841d0c3d87c7ca2d not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.226757 4898 scope.go:117] "RemoveContainer" containerID="e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.230026 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0"} err="failed to get container status \"e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\": rpc error: code = NotFound desc = could not find container \"e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0\": container with ID starting with e3aa907a5ca865510f1aafd0f949963b5938b6fe62fa9fd690447c4ab62566f0 not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.230050 4898 scope.go:117] "RemoveContainer" containerID="6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.231687 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb"} err="failed to get container status \"6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\": rpc error: code = NotFound desc = could not find container \"6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb\": container with ID starting with 6ab0efc2d7d46f575125a46f025b016017e66f74466a9d796dfb0285577f5bcb not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.231702 4898 scope.go:117] "RemoveContainer" containerID="d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.232665 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982"} err="failed to get container status \"d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\": rpc error: code = NotFound desc = could not find container \"d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982\": container with ID starting with 
d992af21fd68cce19a4be8328466a3d46df62ed908fe4e911ba14e7c009ce982 not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.232686 4898 scope.go:117] "RemoveContainer" containerID="dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.233003 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0"} err="failed to get container status \"dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\": rpc error: code = NotFound desc = could not find container \"dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0\": container with ID starting with dd81ec4902eeb225d5a2969e211c63bbab448c1cccb327cfa84fb68c4f386ab0 not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.233020 4898 scope.go:117] "RemoveContainer" containerID="d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.233303 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569"} err="failed to get container status \"d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\": rpc error: code = NotFound desc = could not find container \"d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569\": container with ID starting with d2a77e22893dca5ea818fc6992db6c34f8eb90ce2f25e814bec5f314d60ad569 not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.233320 4898 scope.go:117] "RemoveContainer" containerID="a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.233649 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90"} err="failed to get container status \"a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\": rpc error: code = NotFound desc = could not find container \"a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90\": container with ID starting with a60901e27e67b7483e02800a91a86f2676b610d538c999ab1df0d42fde146f90 not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.233670 4898 scope.go:117] "RemoveContainer" containerID="b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.233965 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127"} err="failed to get container status \"b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\": rpc error: code = NotFound desc = could not find container \"b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127\": container with ID starting with b7e7d7ded2a6e0e20781a16b0ed56a744db593c6fe40c75e89a088ccb8cdb127 not found: ID does not exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.233982 4898 scope.go:117] "RemoveContainer" containerID="8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.234264 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61"} err="failed to get container status \"8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\": rpc error: code = NotFound desc = could not find container \"8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61\": container with ID starting with 8fc397b24e8480c4d07281b09fb871f859eb9fdf47fa9103b9243f91e5800d61 not found: ID does not 
exist" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.782025 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1efa7034-8a95-4e6e-bd84-0189dc5acaa3" path="/var/lib/kubelet/pods/1efa7034-8a95-4e6e-bd84-0189dc5acaa3/volumes" Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.827720 4898 generic.go:334] "Generic (PLEG): container finished" podID="f21a88db-d1eb-4a8d-a82c-c9652c86a935" containerID="b0a219a9dcdaab83c4811ca756411bae8cb946d80503091989a54972463093f0" exitCode=0 Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.827755 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" event={"ID":"f21a88db-d1eb-4a8d-a82c-c9652c86a935","Type":"ContainerStarted","Data":"6be5194a302604ce52a917db09b5317841d48c058966deef479e17cf5851bc17"} Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.827780 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" event={"ID":"f21a88db-d1eb-4a8d-a82c-c9652c86a935","Type":"ContainerStarted","Data":"715d32c3727282b8983b942d9c279cb9807c328d2812f01994170c23094dbfa0"} Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.827791 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" event={"ID":"f21a88db-d1eb-4a8d-a82c-c9652c86a935","Type":"ContainerStarted","Data":"dde86796ed919fe0c17efe3184bd31121459f08c65c3c2b54471ddd30e2d5220"} Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.827800 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" event={"ID":"f21a88db-d1eb-4a8d-a82c-c9652c86a935","Type":"ContainerStarted","Data":"899278671db072bac9ad47d41da3075bbadd06a4e083e6293cc44a6ec663e16a"} Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.827818 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" 
event={"ID":"f21a88db-d1eb-4a8d-a82c-c9652c86a935","Type":"ContainerStarted","Data":"15a9e0b6c1b0746453617e1b715a8b7fe4388f8b8d10398fbc558a92013aef60"} Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.827827 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" event={"ID":"f21a88db-d1eb-4a8d-a82c-c9652c86a935","Type":"ContainerStarted","Data":"3b842b906db0025eec75feeafe2cffb5860b9a9c7e16b51a4ca464de03e1190b"} Dec 11 13:14:54 crc kubenswrapper[4898]: I1211 13:14:54.827837 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" event={"ID":"f21a88db-d1eb-4a8d-a82c-c9652c86a935","Type":"ContainerDied","Data":"b0a219a9dcdaab83c4811ca756411bae8cb946d80503091989a54972463093f0"} Dec 11 13:14:57 crc kubenswrapper[4898]: I1211 13:14:57.849617 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" event={"ID":"f21a88db-d1eb-4a8d-a82c-c9652c86a935","Type":"ContainerStarted","Data":"0e8128d123a137a0d4b8377019e223c9e7d8d95a6ddd7c33471f0bd23d36179e"} Dec 11 13:14:58 crc kubenswrapper[4898]: I1211 13:14:58.851321 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx"] Dec 11 13:14:58 crc kubenswrapper[4898]: I1211 13:14:58.852427 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:14:58 crc kubenswrapper[4898]: I1211 13:14:58.858735 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-5whpt" Dec 11 13:14:58 crc kubenswrapper[4898]: I1211 13:14:58.858749 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 11 13:14:58 crc kubenswrapper[4898]: I1211 13:14:58.860183 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 11 13:14:58 crc kubenswrapper[4898]: I1211 13:14:58.904062 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcdrz\" (UniqueName: \"kubernetes.io/projected/e1934c2f-8b0a-4a1d-9da5-5de2822c6b82-kube-api-access-dcdrz\") pod \"obo-prometheus-operator-668cf9dfbb-kflrx\" (UID: \"e1934c2f-8b0a-4a1d-9da5-5de2822c6b82\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:14:58 crc kubenswrapper[4898]: I1211 13:14:58.919849 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g"] Dec 11 13:14:58 crc kubenswrapper[4898]: I1211 13:14:58.920726 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:14:58 crc kubenswrapper[4898]: I1211 13:14:58.922303 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-kvk4l" Dec 11 13:14:58 crc kubenswrapper[4898]: I1211 13:14:58.922645 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 11 13:14:58 crc kubenswrapper[4898]: I1211 13:14:58.929439 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh"] Dec 11 13:14:58 crc kubenswrapper[4898]: I1211 13:14:58.930239 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.005250 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a328cfd7-383d-4f47-9723-ef24187542bd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh\" (UID: \"a328cfd7-383d-4f47-9723-ef24187542bd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.005559 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g\" (UID: \"8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.005659 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g\" (UID: \"8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.005811 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a328cfd7-383d-4f47-9723-ef24187542bd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh\" (UID: \"a328cfd7-383d-4f47-9723-ef24187542bd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.005900 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcdrz\" (UniqueName: \"kubernetes.io/projected/e1934c2f-8b0a-4a1d-9da5-5de2822c6b82-kube-api-access-dcdrz\") pod \"obo-prometheus-operator-668cf9dfbb-kflrx\" (UID: \"e1934c2f-8b0a-4a1d-9da5-5de2822c6b82\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.025175 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcdrz\" (UniqueName: \"kubernetes.io/projected/e1934c2f-8b0a-4a1d-9da5-5de2822c6b82-kube-api-access-dcdrz\") pod \"obo-prometheus-operator-668cf9dfbb-kflrx\" (UID: \"e1934c2f-8b0a-4a1d-9da5-5de2822c6b82\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.108246 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a328cfd7-383d-4f47-9723-ef24187542bd-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh\" (UID: \"a328cfd7-383d-4f47-9723-ef24187542bd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.108317 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a328cfd7-383d-4f47-9723-ef24187542bd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh\" (UID: \"a328cfd7-383d-4f47-9723-ef24187542bd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.108335 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g\" (UID: \"8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.108364 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g\" (UID: \"8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.111658 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-cnkcn"] Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.112957 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.117203 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g\" (UID: \"8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.118367 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a328cfd7-383d-4f47-9723-ef24187542bd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh\" (UID: \"a328cfd7-383d-4f47-9723-ef24187542bd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:14:59 crc kubenswrapper[4898]: W1211 13:14:59.118808 4898 reflector.go:561] object-"openshift-operators"/"observability-operator-sa-dockercfg-vp9ds": failed to list *v1.Secret: secrets "observability-operator-sa-dockercfg-vp9ds" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-operators": no relationship found between node 'crc' and this object Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.118859 4898 reflector.go:158] "Unhandled Error" err="object-\"openshift-operators\"/\"observability-operator-sa-dockercfg-vp9ds\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"observability-operator-sa-dockercfg-vp9ds\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 11 13:14:59 crc kubenswrapper[4898]: W1211 13:14:59.119018 4898 reflector.go:561] 
object-"openshift-operators"/"observability-operator-tls": failed to list *v1.Secret: secrets "observability-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-operators": no relationship found between node 'crc' and this object Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.119017 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g\" (UID: \"8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.119036 4898 reflector.go:158] "Unhandled Error" err="object-\"openshift-operators\"/\"observability-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"observability-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.125500 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a328cfd7-383d-4f47-9723-ef24187542bd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh\" (UID: \"a328cfd7-383d-4f47-9723-ef24187542bd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.171137 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.200869 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators_e1934c2f-8b0a-4a1d-9da5-5de2822c6b82_0(46f5e9a7961484e1addabb6435f1fe9881a119c82d822261cccdafa1415fdcd4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.200945 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators_e1934c2f-8b0a-4a1d-9da5-5de2822c6b82_0(46f5e9a7961484e1addabb6435f1fe9881a119c82d822261cccdafa1415fdcd4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.200965 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators_e1934c2f-8b0a-4a1d-9da5-5de2822c6b82_0(46f5e9a7961484e1addabb6435f1fe9881a119c82d822261cccdafa1415fdcd4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.201010 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators(e1934c2f-8b0a-4a1d-9da5-5de2822c6b82)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators(e1934c2f-8b0a-4a1d-9da5-5de2822c6b82)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators_e1934c2f-8b0a-4a1d-9da5-5de2822c6b82_0(46f5e9a7961484e1addabb6435f1fe9881a119c82d822261cccdafa1415fdcd4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" podUID="e1934c2f-8b0a-4a1d-9da5-5de2822c6b82" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.209371 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6v29\" (UniqueName: \"kubernetes.io/projected/2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406-kube-api-access-m6v29\") pod \"observability-operator-d8bb48f5d-cnkcn\" (UID: \"2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406\") " pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.209527 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-cnkcn\" (UID: \"2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406\") " pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.239498 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.252569 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.259731 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators_8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef_0(1fb0130333f6f9e93af0ba88a9c693a6b388dfc0e9e0f5c6b8b89f6c439afaa8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.259801 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators_8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef_0(1fb0130333f6f9e93af0ba88a9c693a6b388dfc0e9e0f5c6b8b89f6c439afaa8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.259825 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators_8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef_0(1fb0130333f6f9e93af0ba88a9c693a6b388dfc0e9e0f5c6b8b89f6c439afaa8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.259874 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators(8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators(8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators_8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef_0(1fb0130333f6f9e93af0ba88a9c693a6b388dfc0e9e0f5c6b8b89f6c439afaa8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" podUID="8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.280979 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators_a328cfd7-383d-4f47-9723-ef24187542bd_0(aa3429052c6e853be008f42de5510f81cbb626f5c1963990adadeb2b8ede7323): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.281024 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators_a328cfd7-383d-4f47-9723-ef24187542bd_0(aa3429052c6e853be008f42de5510f81cbb626f5c1963990adadeb2b8ede7323): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.281048 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators_a328cfd7-383d-4f47-9723-ef24187542bd_0(aa3429052c6e853be008f42de5510f81cbb626f5c1963990adadeb2b8ede7323): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.281090 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators(a328cfd7-383d-4f47-9723-ef24187542bd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators(a328cfd7-383d-4f47-9723-ef24187542bd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators_a328cfd7-383d-4f47-9723-ef24187542bd_0(aa3429052c6e853be008f42de5510f81cbb626f5c1963990adadeb2b8ede7323): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" podUID="a328cfd7-383d-4f47-9723-ef24187542bd" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.299791 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-nq968"] Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.300530 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.301900 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-7qrvb" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.310253 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-cnkcn\" (UID: \"2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406\") " pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.310364 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6v29\" (UniqueName: \"kubernetes.io/projected/2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406-kube-api-access-m6v29\") pod \"observability-operator-d8bb48f5d-cnkcn\" (UID: \"2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406\") " pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.326038 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6v29\" (UniqueName: \"kubernetes.io/projected/2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406-kube-api-access-m6v29\") pod \"observability-operator-d8bb48f5d-cnkcn\" (UID: \"2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406\") " pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.411475 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzmjb\" (UniqueName: \"kubernetes.io/projected/c5f15058-ca6b-40a5-bad2-83ea7339d28b-kube-api-access-bzmjb\") pod \"perses-operator-5446b9c989-nq968\" (UID: \"c5f15058-ca6b-40a5-bad2-83ea7339d28b\") " 
pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.411550 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5f15058-ca6b-40a5-bad2-83ea7339d28b-openshift-service-ca\") pod \"perses-operator-5446b9c989-nq968\" (UID: \"c5f15058-ca6b-40a5-bad2-83ea7339d28b\") " pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.512551 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5f15058-ca6b-40a5-bad2-83ea7339d28b-openshift-service-ca\") pod \"perses-operator-5446b9c989-nq968\" (UID: \"c5f15058-ca6b-40a5-bad2-83ea7339d28b\") " pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.512653 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzmjb\" (UniqueName: \"kubernetes.io/projected/c5f15058-ca6b-40a5-bad2-83ea7339d28b-kube-api-access-bzmjb\") pod \"perses-operator-5446b9c989-nq968\" (UID: \"c5f15058-ca6b-40a5-bad2-83ea7339d28b\") " pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.513864 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5f15058-ca6b-40a5-bad2-83ea7339d28b-openshift-service-ca\") pod \"perses-operator-5446b9c989-nq968\" (UID: \"c5f15058-ca6b-40a5-bad2-83ea7339d28b\") " pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.559122 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzmjb\" (UniqueName: 
\"kubernetes.io/projected/c5f15058-ca6b-40a5-bad2-83ea7339d28b-kube-api-access-bzmjb\") pod \"perses-operator-5446b9c989-nq968\" (UID: \"c5f15058-ca6b-40a5-bad2-83ea7339d28b\") " pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.622436 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.647618 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nq968_openshift-operators_c5f15058-ca6b-40a5-bad2-83ea7339d28b_0(45f279efbf381f4cfd06afcb089a68e736980dddcd9b5d5f793a2a106e26beb7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.647933 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nq968_openshift-operators_c5f15058-ca6b-40a5-bad2-83ea7339d28b_0(45f279efbf381f4cfd06afcb089a68e736980dddcd9b5d5f793a2a106e26beb7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.647956 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nq968_openshift-operators_c5f15058-ca6b-40a5-bad2-83ea7339d28b_0(45f279efbf381f4cfd06afcb089a68e736980dddcd9b5d5f793a2a106e26beb7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:14:59 crc kubenswrapper[4898]: E1211 13:14:59.647995 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-nq968_openshift-operators(c5f15058-ca6b-40a5-bad2-83ea7339d28b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-nq968_openshift-operators(c5f15058-ca6b-40a5-bad2-83ea7339d28b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nq968_openshift-operators_c5f15058-ca6b-40a5-bad2-83ea7339d28b_0(45f279efbf381f4cfd06afcb089a68e736980dddcd9b5d5f793a2a106e26beb7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-nq968" podUID="c5f15058-ca6b-40a5-bad2-83ea7339d28b" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.866143 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" event={"ID":"f21a88db-d1eb-4a8d-a82c-c9652c86a935","Type":"ContainerStarted","Data":"a9cc5fbedd1de0e44bf13452301ee351a41ef63071c0466c939dfce67afee74f"} Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.866499 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.866534 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.892531 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" podStartSLOduration=6.892512876 podStartE2EDuration="6.892512876s" podCreationTimestamp="2025-12-11 13:14:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-11 13:14:59.890641635 +0000 UTC m=+657.462968072" watchObservedRunningTime="2025-12-11 13:14:59.892512876 +0000 UTC m=+657.464839313" Dec 11 13:14:59 crc kubenswrapper[4898]: I1211 13:14:59.897376 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.139354 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.146117 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-cnkcn\" (UID: \"2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406\") " pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.182755 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn"] Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.183537 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.185563 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.186302 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.237477 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-config-volume\") pod \"collect-profiles-29424315-nc8pn\" (UID: \"2eff25b9-7a31-4118-b8c3-c57b8b4714fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.237535 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-secret-volume\") pod \"collect-profiles-29424315-nc8pn\" (UID: \"2eff25b9-7a31-4118-b8c3-c57b8b4714fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.237639 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lj48\" (UniqueName: \"kubernetes.io/projected/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-kube-api-access-6lj48\") pod \"collect-profiles-29424315-nc8pn\" (UID: \"2eff25b9-7a31-4118-b8c3-c57b8b4714fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.338527 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6lj48\" (UniqueName: \"kubernetes.io/projected/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-kube-api-access-6lj48\") pod \"collect-profiles-29424315-nc8pn\" (UID: \"2eff25b9-7a31-4118-b8c3-c57b8b4714fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.338631 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-config-volume\") pod \"collect-profiles-29424315-nc8pn\" (UID: \"2eff25b9-7a31-4118-b8c3-c57b8b4714fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.338672 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-secret-volume\") pod \"collect-profiles-29424315-nc8pn\" (UID: \"2eff25b9-7a31-4118-b8c3-c57b8b4714fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.339614 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-config-volume\") pod \"collect-profiles-29424315-nc8pn\" (UID: \"2eff25b9-7a31-4118-b8c3-c57b8b4714fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.344297 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-secret-volume\") pod \"collect-profiles-29424315-nc8pn\" (UID: \"2eff25b9-7a31-4118-b8c3-c57b8b4714fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 
13:15:00.376181 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lj48\" (UniqueName: \"kubernetes.io/projected/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-kube-api-access-6lj48\") pod \"collect-profiles-29424315-nc8pn\" (UID: \"2eff25b9-7a31-4118-b8c3-c57b8b4714fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.400787 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-nq968"] Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.400938 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.401509 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.406856 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g"] Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.406970 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.407441 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.457600 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-vp9ds" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.463083 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.470337 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators_8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef_0(42eb89c23d7619f09f5f80ce4192f0cf7281ebf4befe85d08b65dd586ea0d216): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.470410 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators_8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef_0(42eb89c23d7619f09f5f80ce4192f0cf7281ebf4befe85d08b65dd586ea0d216): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.470438 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators_8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef_0(42eb89c23d7619f09f5f80ce4192f0cf7281ebf4befe85d08b65dd586ea0d216): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.470505 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators(8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators(8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators_8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef_0(42eb89c23d7619f09f5f80ce4192f0cf7281ebf4befe85d08b65dd586ea0d216): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" podUID="8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.476661 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nq968_openshift-operators_c5f15058-ca6b-40a5-bad2-83ea7339d28b_0(d26d2b3073505452de30560b84dceecdad1a33df53f3a075e35e4ae91f22b48f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.476728 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nq968_openshift-operators_c5f15058-ca6b-40a5-bad2-83ea7339d28b_0(d26d2b3073505452de30560b84dceecdad1a33df53f3a075e35e4ae91f22b48f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.476755 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nq968_openshift-operators_c5f15058-ca6b-40a5-bad2-83ea7339d28b_0(d26d2b3073505452de30560b84dceecdad1a33df53f3a075e35e4ae91f22b48f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.476804 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-nq968_openshift-operators(c5f15058-ca6b-40a5-bad2-83ea7339d28b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-nq968_openshift-operators(c5f15058-ca6b-40a5-bad2-83ea7339d28b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nq968_openshift-operators_c5f15058-ca6b-40a5-bad2-83ea7339d28b_0(d26d2b3073505452de30560b84dceecdad1a33df53f3a075e35e4ae91f22b48f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-nq968" podUID="c5f15058-ca6b-40a5-bad2-83ea7339d28b" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.499448 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.500003 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-cnkcn_openshift-operators_2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406_0(ad22ae522c41a6ab804e8b56d051faae4e0c2b71666e5cc87284ee5e2eaa1a32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.500054 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-cnkcn_openshift-operators_2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406_0(ad22ae522c41a6ab804e8b56d051faae4e0c2b71666e5cc87284ee5e2eaa1a32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.500080 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-cnkcn_openshift-operators_2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406_0(ad22ae522c41a6ab804e8b56d051faae4e0c2b71666e5cc87284ee5e2eaa1a32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.500124 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-cnkcn_openshift-operators(2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-cnkcn_openshift-operators(2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-cnkcn_openshift-operators_2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406_0(ad22ae522c41a6ab804e8b56d051faae4e0c2b71666e5cc87284ee5e2eaa1a32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" podUID="2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.518927 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh"] Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.519034 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.519504 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.528811 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-cnkcn"] Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.539183 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager_2eff25b9-7a31-4118-b8c3-c57b8b4714fa_0(5ec080a1de2bbad35c9a93359e2f018d908e2a8a9c1e7268ece35aeeed026ef2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.539243 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager_2eff25b9-7a31-4118-b8c3-c57b8b4714fa_0(5ec080a1de2bbad35c9a93359e2f018d908e2a8a9c1e7268ece35aeeed026ef2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.539267 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager_2eff25b9-7a31-4118-b8c3-c57b8b4714fa_0(5ec080a1de2bbad35c9a93359e2f018d908e2a8a9c1e7268ece35aeeed026ef2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.539324 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager(2eff25b9-7a31-4118-b8c3-c57b8b4714fa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager(2eff25b9-7a31-4118-b8c3-c57b8b4714fa)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager_2eff25b9-7a31-4118-b8c3-c57b8b4714fa_0(5ec080a1de2bbad35c9a93359e2f018d908e2a8a9c1e7268ece35aeeed026ef2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" podUID="2eff25b9-7a31-4118-b8c3-c57b8b4714fa" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.547308 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx"] Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.547405 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.547814 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.559696 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators_a328cfd7-383d-4f47-9723-ef24187542bd_0(4581dfa9117119dcbb9943ac11d9db0475d21c4d240729dce917f25a50304ace): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.559755 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators_a328cfd7-383d-4f47-9723-ef24187542bd_0(4581dfa9117119dcbb9943ac11d9db0475d21c4d240729dce917f25a50304ace): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.559780 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators_a328cfd7-383d-4f47-9723-ef24187542bd_0(4581dfa9117119dcbb9943ac11d9db0475d21c4d240729dce917f25a50304ace): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.559820 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators(a328cfd7-383d-4f47-9723-ef24187542bd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators(a328cfd7-383d-4f47-9723-ef24187542bd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators_a328cfd7-383d-4f47-9723-ef24187542bd_0(4581dfa9117119dcbb9943ac11d9db0475d21c4d240729dce917f25a50304ace): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" podUID="a328cfd7-383d-4f47-9723-ef24187542bd" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.576062 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn"] Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.579603 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators_e1934c2f-8b0a-4a1d-9da5-5de2822c6b82_0(60dfec48240f885fa090d63b58128261d3c6708d5e74cdb5d8b701bfdec617a4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.579659 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators_e1934c2f-8b0a-4a1d-9da5-5de2822c6b82_0(60dfec48240f885fa090d63b58128261d3c6708d5e74cdb5d8b701bfdec617a4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.579678 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators_e1934c2f-8b0a-4a1d-9da5-5de2822c6b82_0(60dfec48240f885fa090d63b58128261d3c6708d5e74cdb5d8b701bfdec617a4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.579726 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators(e1934c2f-8b0a-4a1d-9da5-5de2822c6b82)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators(e1934c2f-8b0a-4a1d-9da5-5de2822c6b82)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators_e1934c2f-8b0a-4a1d-9da5-5de2822c6b82_0(60dfec48240f885fa090d63b58128261d3c6708d5e74cdb5d8b701bfdec617a4): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" podUID="e1934c2f-8b0a-4a1d-9da5-5de2822c6b82" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.873130 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.873144 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.873521 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.874597 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.874727 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.901539 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager_2eff25b9-7a31-4118-b8c3-c57b8b4714fa_0(36929442afe7635cd27b18a4c1662756499228a2e3c4a59718e3f0859ee2fb19): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.901603 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager_2eff25b9-7a31-4118-b8c3-c57b8b4714fa_0(36929442afe7635cd27b18a4c1662756499228a2e3c4a59718e3f0859ee2fb19): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.901622 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager_2eff25b9-7a31-4118-b8c3-c57b8b4714fa_0(36929442afe7635cd27b18a4c1662756499228a2e3c4a59718e3f0859ee2fb19): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.901661 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager(2eff25b9-7a31-4118-b8c3-c57b8b4714fa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager(2eff25b9-7a31-4118-b8c3-c57b8b4714fa)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager_2eff25b9-7a31-4118-b8c3-c57b8b4714fa_0(36929442afe7635cd27b18a4c1662756499228a2e3c4a59718e3f0859ee2fb19): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" podUID="2eff25b9-7a31-4118-b8c3-c57b8b4714fa" Dec 11 13:15:00 crc kubenswrapper[4898]: I1211 13:15:00.906329 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.908308 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-cnkcn_openshift-operators_2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406_0(f818f67e4eba03a8efd108defd180c9817c5db53f3ef89dc934eda7618d18b30): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.908342 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-cnkcn_openshift-operators_2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406_0(f818f67e4eba03a8efd108defd180c9817c5db53f3ef89dc934eda7618d18b30): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.908382 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-cnkcn_openshift-operators_2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406_0(f818f67e4eba03a8efd108defd180c9817c5db53f3ef89dc934eda7618d18b30): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:00 crc kubenswrapper[4898]: E1211 13:15:00.908420 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-cnkcn_openshift-operators(2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-cnkcn_openshift-operators(2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-cnkcn_openshift-operators_2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406_0(f818f67e4eba03a8efd108defd180c9817c5db53f3ef89dc934eda7618d18b30): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" podUID="2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406" Dec 11 13:15:05 crc kubenswrapper[4898]: I1211 13:15:05.775149 4898 scope.go:117] "RemoveContainer" containerID="114b3259a9fd034573fe9dc3c103980a05be8bdb8205c084f35e27725255ec28" Dec 11 13:15:05 crc kubenswrapper[4898]: E1211 13:15:05.776143 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-dlqfj_openshift-multus(4e8ed6cb-b822-4b64-9e00-e755c5aea812)\"" pod="openshift-multus/multus-dlqfj" podUID="4e8ed6cb-b822-4b64-9e00-e755c5aea812" Dec 11 13:15:11 crc kubenswrapper[4898]: I1211 13:15:11.774719 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:11 crc kubenswrapper[4898]: I1211 13:15:11.774848 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:15:11 crc kubenswrapper[4898]: I1211 13:15:11.775536 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:11 crc kubenswrapper[4898]: I1211 13:15:11.775933 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:15:11 crc kubenswrapper[4898]: E1211 13:15:11.821548 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager_2eff25b9-7a31-4118-b8c3-c57b8b4714fa_0(340f2e458eeb75afc35faf566656a0deb4a8b07a0563e2e6d3b8aa7cdb3a0537): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:15:11 crc kubenswrapper[4898]: E1211 13:15:11.821656 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager_2eff25b9-7a31-4118-b8c3-c57b8b4714fa_0(340f2e458eeb75afc35faf566656a0deb4a8b07a0563e2e6d3b8aa7cdb3a0537): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:11 crc kubenswrapper[4898]: E1211 13:15:11.821689 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager_2eff25b9-7a31-4118-b8c3-c57b8b4714fa_0(340f2e458eeb75afc35faf566656a0deb4a8b07a0563e2e6d3b8aa7cdb3a0537): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:11 crc kubenswrapper[4898]: E1211 13:15:11.821754 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager(2eff25b9-7a31-4118-b8c3-c57b8b4714fa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager(2eff25b9-7a31-4118-b8c3-c57b8b4714fa)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29424315-nc8pn_openshift-operator-lifecycle-manager_2eff25b9-7a31-4118-b8c3-c57b8b4714fa_0(340f2e458eeb75afc35faf566656a0deb4a8b07a0563e2e6d3b8aa7cdb3a0537): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" podUID="2eff25b9-7a31-4118-b8c3-c57b8b4714fa" Dec 11 13:15:11 crc kubenswrapper[4898]: E1211 13:15:11.830426 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators_e1934c2f-8b0a-4a1d-9da5-5de2822c6b82_0(029196928e115fb5e618a5f99d7c06f0f95f4d9e0fbb29c6c115d68c25f3d079): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:15:11 crc kubenswrapper[4898]: E1211 13:15:11.830519 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators_e1934c2f-8b0a-4a1d-9da5-5de2822c6b82_0(029196928e115fb5e618a5f99d7c06f0f95f4d9e0fbb29c6c115d68c25f3d079): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:15:11 crc kubenswrapper[4898]: E1211 13:15:11.830545 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators_e1934c2f-8b0a-4a1d-9da5-5de2822c6b82_0(029196928e115fb5e618a5f99d7c06f0f95f4d9e0fbb29c6c115d68c25f3d079): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:15:11 crc kubenswrapper[4898]: E1211 13:15:11.830604 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators(e1934c2f-8b0a-4a1d-9da5-5de2822c6b82)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators(e1934c2f-8b0a-4a1d-9da5-5de2822c6b82)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-kflrx_openshift-operators_e1934c2f-8b0a-4a1d-9da5-5de2822c6b82_0(029196928e115fb5e618a5f99d7c06f0f95f4d9e0fbb29c6c115d68c25f3d079): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" podUID="e1934c2f-8b0a-4a1d-9da5-5de2822c6b82" Dec 11 13:15:13 crc kubenswrapper[4898]: I1211 13:15:13.774934 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:15:13 crc kubenswrapper[4898]: I1211 13:15:13.776301 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:15:13 crc kubenswrapper[4898]: E1211 13:15:13.810353 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nq968_openshift-operators_c5f15058-ca6b-40a5-bad2-83ea7339d28b_0(df5709af972a85cc22018a0da70ab8fc16d5b07d3b6cd1fd9882f092678e0cbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:15:13 crc kubenswrapper[4898]: E1211 13:15:13.810426 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nq968_openshift-operators_c5f15058-ca6b-40a5-bad2-83ea7339d28b_0(df5709af972a85cc22018a0da70ab8fc16d5b07d3b6cd1fd9882f092678e0cbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:15:13 crc kubenswrapper[4898]: E1211 13:15:13.810496 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nq968_openshift-operators_c5f15058-ca6b-40a5-bad2-83ea7339d28b_0(df5709af972a85cc22018a0da70ab8fc16d5b07d3b6cd1fd9882f092678e0cbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:15:13 crc kubenswrapper[4898]: E1211 13:15:13.810542 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-nq968_openshift-operators(c5f15058-ca6b-40a5-bad2-83ea7339d28b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-nq968_openshift-operators(c5f15058-ca6b-40a5-bad2-83ea7339d28b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-nq968_openshift-operators_c5f15058-ca6b-40a5-bad2-83ea7339d28b_0(df5709af972a85cc22018a0da70ab8fc16d5b07d3b6cd1fd9882f092678e0cbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-nq968" podUID="c5f15058-ca6b-40a5-bad2-83ea7339d28b" Dec 11 13:15:14 crc kubenswrapper[4898]: I1211 13:15:14.774805 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:15:14 crc kubenswrapper[4898]: I1211 13:15:14.774868 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:15:14 crc kubenswrapper[4898]: I1211 13:15:14.775858 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:15:14 crc kubenswrapper[4898]: I1211 13:15:14.776026 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:15:14 crc kubenswrapper[4898]: E1211 13:15:14.842999 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators_8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef_0(67c3bb5c2a3e6343b326fafb58cd06f8c595b2dd9eba5745f713afd59cfd7266): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:15:14 crc kubenswrapper[4898]: E1211 13:15:14.843127 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators_8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef_0(67c3bb5c2a3e6343b326fafb58cd06f8c595b2dd9eba5745f713afd59cfd7266): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:15:14 crc kubenswrapper[4898]: E1211 13:15:14.843174 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators_8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef_0(67c3bb5c2a3e6343b326fafb58cd06f8c595b2dd9eba5745f713afd59cfd7266): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:15:14 crc kubenswrapper[4898]: E1211 13:15:14.843257 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators(8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators(8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_openshift-operators_8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef_0(67c3bb5c2a3e6343b326fafb58cd06f8c595b2dd9eba5745f713afd59cfd7266): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" podUID="8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef" Dec 11 13:15:14 crc kubenswrapper[4898]: E1211 13:15:14.867353 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators_a328cfd7-383d-4f47-9723-ef24187542bd_0(8978f91f74c4195c400148bbb9906ea0c259b67f5b95b4e6b88606202b8fc8fd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:15:14 crc kubenswrapper[4898]: E1211 13:15:14.867430 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators_a328cfd7-383d-4f47-9723-ef24187542bd_0(8978f91f74c4195c400148bbb9906ea0c259b67f5b95b4e6b88606202b8fc8fd): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:15:14 crc kubenswrapper[4898]: E1211 13:15:14.867474 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators_a328cfd7-383d-4f47-9723-ef24187542bd_0(8978f91f74c4195c400148bbb9906ea0c259b67f5b95b4e6b88606202b8fc8fd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:15:14 crc kubenswrapper[4898]: E1211 13:15:14.867543 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators(a328cfd7-383d-4f47-9723-ef24187542bd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators(a328cfd7-383d-4f47-9723-ef24187542bd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_openshift-operators_a328cfd7-383d-4f47-9723-ef24187542bd_0(8978f91f74c4195c400148bbb9906ea0c259b67f5b95b4e6b88606202b8fc8fd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" podUID="a328cfd7-383d-4f47-9723-ef24187542bd" Dec 11 13:15:15 crc kubenswrapper[4898]: I1211 13:15:15.774208 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:15 crc kubenswrapper[4898]: I1211 13:15:15.774822 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:15 crc kubenswrapper[4898]: E1211 13:15:15.812479 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-cnkcn_openshift-operators_2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406_0(ca573be3a2fcdc05de72f9f0d98b81e1690211556a2c379a270bfcfc26b4b600): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 11 13:15:15 crc kubenswrapper[4898]: E1211 13:15:15.812557 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-cnkcn_openshift-operators_2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406_0(ca573be3a2fcdc05de72f9f0d98b81e1690211556a2c379a270bfcfc26b4b600): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:15 crc kubenswrapper[4898]: E1211 13:15:15.812587 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-cnkcn_openshift-operators_2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406_0(ca573be3a2fcdc05de72f9f0d98b81e1690211556a2c379a270bfcfc26b4b600): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:15 crc kubenswrapper[4898]: E1211 13:15:15.812642 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-cnkcn_openshift-operators(2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-cnkcn_openshift-operators(2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-cnkcn_openshift-operators_2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406_0(ca573be3a2fcdc05de72f9f0d98b81e1690211556a2c379a270bfcfc26b4b600): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" podUID="2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406" Dec 11 13:15:17 crc kubenswrapper[4898]: I1211 13:15:17.775355 4898 scope.go:117] "RemoveContainer" containerID="114b3259a9fd034573fe9dc3c103980a05be8bdb8205c084f35e27725255ec28" Dec 11 13:15:17 crc kubenswrapper[4898]: I1211 13:15:17.973930 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dlqfj_4e8ed6cb-b822-4b64-9e00-e755c5aea812/kube-multus/2.log" Dec 11 13:15:17 crc kubenswrapper[4898]: I1211 13:15:17.974014 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dlqfj" event={"ID":"4e8ed6cb-b822-4b64-9e00-e755c5aea812","Type":"ContainerStarted","Data":"e68bcd3b1191c751961083da0fe8600593c177ee74f7c2994d009665929c93b3"} Dec 11 13:15:22 crc kubenswrapper[4898]: I1211 13:15:22.774851 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:15:22 crc kubenswrapper[4898]: I1211 13:15:22.779593 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" Dec 11 13:15:23 crc kubenswrapper[4898]: I1211 13:15:23.003177 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx"] Dec 11 13:15:23 crc kubenswrapper[4898]: I1211 13:15:23.015613 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" event={"ID":"e1934c2f-8b0a-4a1d-9da5-5de2822c6b82","Type":"ContainerStarted","Data":"a68af52f7da602f5a57ca95ed124e05049eda59718db273d419b07361952e88c"} Dec 11 13:15:23 crc kubenswrapper[4898]: I1211 13:15:23.640309 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwjvl" Dec 11 13:15:25 crc kubenswrapper[4898]: I1211 13:15:25.774342 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:25 crc kubenswrapper[4898]: I1211 13:15:25.774844 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:26 crc kubenswrapper[4898]: I1211 13:15:26.164254 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn"] Dec 11 13:15:26 crc kubenswrapper[4898]: I1211 13:15:26.774885 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:15:26 crc kubenswrapper[4898]: I1211 13:15:26.775100 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:26 crc kubenswrapper[4898]: I1211 13:15:26.775718 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:26 crc kubenswrapper[4898]: I1211 13:15:26.775838 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" Dec 11 13:15:28 crc kubenswrapper[4898]: I1211 13:15:28.775630 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:15:28 crc kubenswrapper[4898]: I1211 13:15:28.775647 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:15:28 crc kubenswrapper[4898]: I1211 13:15:28.776034 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" Dec 11 13:15:28 crc kubenswrapper[4898]: I1211 13:15:28.776335 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:15:29 crc kubenswrapper[4898]: W1211 13:15:29.233675 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eff25b9_7a31_4118_b8c3_c57b8b4714fa.slice/crio-701a2ba4f514f3e6a4958624c748108a2f543468a3e1f7c0428d717c52d65420 WatchSource:0}: Error finding container 701a2ba4f514f3e6a4958624c748108a2f543468a3e1f7c0428d717c52d65420: Status 404 returned error can't find the container with id 701a2ba4f514f3e6a4958624c748108a2f543468a3e1f7c0428d717c52d65420 Dec 11 13:15:29 crc kubenswrapper[4898]: I1211 13:15:29.543229 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-cnkcn"] Dec 11 13:15:29 crc kubenswrapper[4898]: W1211 13:15:29.585587 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fed6ea1_0c4d_476c_9f45_ca4b5a9dc406.slice/crio-f00c10b3f7c77d6a5790bc82741cec6b4f58a0eccbb389214527f06cfecb5533 WatchSource:0}: Error finding container f00c10b3f7c77d6a5790bc82741cec6b4f58a0eccbb389214527f06cfecb5533: Status 404 returned error can't find the container with id f00c10b3f7c77d6a5790bc82741cec6b4f58a0eccbb389214527f06cfecb5533 Dec 11 13:15:29 crc kubenswrapper[4898]: I1211 13:15:29.614718 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh"] Dec 11 13:15:29 crc kubenswrapper[4898]: W1211 13:15:29.617716 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda328cfd7_383d_4f47_9723_ef24187542bd.slice/crio-3e8bc4fa0f1e448f361e8a9b5f21756c3ed1aeaac7831ce0abbbad19e031152e WatchSource:0}: Error finding container 3e8bc4fa0f1e448f361e8a9b5f21756c3ed1aeaac7831ce0abbbad19e031152e: Status 404 returned error can't 
find the container with id 3e8bc4fa0f1e448f361e8a9b5f21756c3ed1aeaac7831ce0abbbad19e031152e Dec 11 13:15:29 crc kubenswrapper[4898]: I1211 13:15:29.872395 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g"] Dec 11 13:15:29 crc kubenswrapper[4898]: I1211 13:15:29.888312 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-nq968"] Dec 11 13:15:29 crc kubenswrapper[4898]: W1211 13:15:29.932449 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5f15058_ca6b_40a5_bad2_83ea7339d28b.slice/crio-79c150b7821cb18f34b0ba9c84ecea630e00d0bcf6b7d2bc31499dd2d71fe106 WatchSource:0}: Error finding container 79c150b7821cb18f34b0ba9c84ecea630e00d0bcf6b7d2bc31499dd2d71fe106: Status 404 returned error can't find the container with id 79c150b7821cb18f34b0ba9c84ecea630e00d0bcf6b7d2bc31499dd2d71fe106 Dec 11 13:15:29 crc kubenswrapper[4898]: W1211 13:15:29.932745 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c1b0eb8_fe9f_4ad8_897e_52e76251f1ef.slice/crio-5371517ff719490a6517b986b4aada986f27201bfd6e7629b18cb334c499734e WatchSource:0}: Error finding container 5371517ff719490a6517b986b4aada986f27201bfd6e7629b18cb334c499734e: Status 404 returned error can't find the container with id 5371517ff719490a6517b986b4aada986f27201bfd6e7629b18cb334c499734e Dec 11 13:15:30 crc kubenswrapper[4898]: I1211 13:15:30.059108 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" event={"ID":"2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406","Type":"ContainerStarted","Data":"f00c10b3f7c77d6a5790bc82741cec6b4f58a0eccbb389214527f06cfecb5533"} Dec 11 13:15:30 crc kubenswrapper[4898]: I1211 13:15:30.061232 4898 generic.go:334] "Generic (PLEG): container 
finished" podID="2eff25b9-7a31-4118-b8c3-c57b8b4714fa" containerID="be19fc1280d69a9d6fa41cdb51d79c5e5bb5bf4dee9f5d0453662f5904a9dc80" exitCode=0 Dec 11 13:15:30 crc kubenswrapper[4898]: I1211 13:15:30.061581 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" event={"ID":"2eff25b9-7a31-4118-b8c3-c57b8b4714fa","Type":"ContainerDied","Data":"be19fc1280d69a9d6fa41cdb51d79c5e5bb5bf4dee9f5d0453662f5904a9dc80"} Dec 11 13:15:30 crc kubenswrapper[4898]: I1211 13:15:30.061603 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" event={"ID":"2eff25b9-7a31-4118-b8c3-c57b8b4714fa","Type":"ContainerStarted","Data":"701a2ba4f514f3e6a4958624c748108a2f543468a3e1f7c0428d717c52d65420"} Dec 11 13:15:30 crc kubenswrapper[4898]: I1211 13:15:30.063277 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" event={"ID":"a328cfd7-383d-4f47-9723-ef24187542bd","Type":"ContainerStarted","Data":"3e8bc4fa0f1e448f361e8a9b5f21756c3ed1aeaac7831ce0abbbad19e031152e"} Dec 11 13:15:30 crc kubenswrapper[4898]: I1211 13:15:30.065337 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-nq968" event={"ID":"c5f15058-ca6b-40a5-bad2-83ea7339d28b","Type":"ContainerStarted","Data":"79c150b7821cb18f34b0ba9c84ecea630e00d0bcf6b7d2bc31499dd2d71fe106"} Dec 11 13:15:30 crc kubenswrapper[4898]: I1211 13:15:30.066905 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" event={"ID":"e1934c2f-8b0a-4a1d-9da5-5de2822c6b82","Type":"ContainerStarted","Data":"19ecce3ea7c24773a2e9047e44577901c4791c4bb44d21463c260e4ee45cfb8e"} Dec 11 13:15:30 crc kubenswrapper[4898]: I1211 13:15:30.068110 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" event={"ID":"8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef","Type":"ContainerStarted","Data":"5371517ff719490a6517b986b4aada986f27201bfd6e7629b18cb334c499734e"} Dec 11 13:15:30 crc kubenswrapper[4898]: I1211 13:15:30.096426 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-kflrx" podStartSLOduration=25.758102726 podStartE2EDuration="32.096411014s" podCreationTimestamp="2025-12-11 13:14:58 +0000 UTC" firstStartedPulling="2025-12-11 13:15:23.011450036 +0000 UTC m=+680.583776483" lastFinishedPulling="2025-12-11 13:15:29.349758334 +0000 UTC m=+686.922084771" observedRunningTime="2025-12-11 13:15:30.095237912 +0000 UTC m=+687.667564349" watchObservedRunningTime="2025-12-11 13:15:30.096411014 +0000 UTC m=+687.668737451" Dec 11 13:15:31 crc kubenswrapper[4898]: I1211 13:15:31.355529 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:31 crc kubenswrapper[4898]: I1211 13:15:31.392864 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-secret-volume\") pod \"2eff25b9-7a31-4118-b8c3-c57b8b4714fa\" (UID: \"2eff25b9-7a31-4118-b8c3-c57b8b4714fa\") " Dec 11 13:15:31 crc kubenswrapper[4898]: I1211 13:15:31.392937 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-config-volume\") pod \"2eff25b9-7a31-4118-b8c3-c57b8b4714fa\" (UID: \"2eff25b9-7a31-4118-b8c3-c57b8b4714fa\") " Dec 11 13:15:31 crc kubenswrapper[4898]: I1211 13:15:31.392962 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lj48\" (UniqueName: \"kubernetes.io/projected/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-kube-api-access-6lj48\") pod \"2eff25b9-7a31-4118-b8c3-c57b8b4714fa\" (UID: \"2eff25b9-7a31-4118-b8c3-c57b8b4714fa\") " Dec 11 13:15:31 crc kubenswrapper[4898]: I1211 13:15:31.394337 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-config-volume" (OuterVolumeSpecName: "config-volume") pod "2eff25b9-7a31-4118-b8c3-c57b8b4714fa" (UID: "2eff25b9-7a31-4118-b8c3-c57b8b4714fa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:15:31 crc kubenswrapper[4898]: I1211 13:15:31.398816 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-kube-api-access-6lj48" (OuterVolumeSpecName: "kube-api-access-6lj48") pod "2eff25b9-7a31-4118-b8c3-c57b8b4714fa" (UID: "2eff25b9-7a31-4118-b8c3-c57b8b4714fa"). 
InnerVolumeSpecName "kube-api-access-6lj48". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:15:31 crc kubenswrapper[4898]: I1211 13:15:31.399036 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2eff25b9-7a31-4118-b8c3-c57b8b4714fa" (UID: "2eff25b9-7a31-4118-b8c3-c57b8b4714fa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:15:31 crc kubenswrapper[4898]: I1211 13:15:31.495801 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 11 13:15:31 crc kubenswrapper[4898]: I1211 13:15:31.495839 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 13:15:31 crc kubenswrapper[4898]: I1211 13:15:31.495850 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lj48\" (UniqueName: \"kubernetes.io/projected/2eff25b9-7a31-4118-b8c3-c57b8b4714fa-kube-api-access-6lj48\") on node \"crc\" DevicePath \"\"" Dec 11 13:15:32 crc kubenswrapper[4898]: I1211 13:15:32.083606 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" event={"ID":"2eff25b9-7a31-4118-b8c3-c57b8b4714fa","Type":"ContainerDied","Data":"701a2ba4f514f3e6a4958624c748108a2f543468a3e1f7c0428d717c52d65420"} Dec 11 13:15:32 crc kubenswrapper[4898]: I1211 13:15:32.083654 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="701a2ba4f514f3e6a4958624c748108a2f543468a3e1f7c0428d717c52d65420" Dec 11 13:15:32 crc kubenswrapper[4898]: I1211 13:15:32.083701 4898 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn" Dec 11 13:15:33 crc kubenswrapper[4898]: I1211 13:15:33.090581 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" event={"ID":"8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef","Type":"ContainerStarted","Data":"9ca3b4b2492df0bdc1c5dec0c332476e3a6be87a5d84ec1ae46a82f0fcf4ebb9"} Dec 11 13:15:33 crc kubenswrapper[4898]: I1211 13:15:33.093214 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" event={"ID":"a328cfd7-383d-4f47-9723-ef24187542bd","Type":"ContainerStarted","Data":"4036ec5f7752520bf4900c5a46dd3e5cb10f3bd5931a7ab0290cbc582aaee976"} Dec 11 13:15:33 crc kubenswrapper[4898]: I1211 13:15:33.095281 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-nq968" event={"ID":"c5f15058-ca6b-40a5-bad2-83ea7339d28b","Type":"ContainerStarted","Data":"4f202a14dbbe0faa9ab83421ece06e0524ece0d8921b7980bd6a5bb5e4d695f8"} Dec 11 13:15:33 crc kubenswrapper[4898]: I1211 13:15:33.095426 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:15:33 crc kubenswrapper[4898]: I1211 13:15:33.109093 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g" podStartSLOduration=32.363998237 podStartE2EDuration="35.109073385s" podCreationTimestamp="2025-12-11 13:14:58 +0000 UTC" firstStartedPulling="2025-12-11 13:15:29.935438823 +0000 UTC m=+687.507765250" lastFinishedPulling="2025-12-11 13:15:32.680513971 +0000 UTC m=+690.252840398" observedRunningTime="2025-12-11 13:15:33.10744047 +0000 UTC m=+690.679766917" watchObservedRunningTime="2025-12-11 13:15:33.109073385 
+0000 UTC m=+690.681399822" Dec 11 13:15:33 crc kubenswrapper[4898]: I1211 13:15:33.142944 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-nq968" podStartSLOduration=31.401511114 podStartE2EDuration="34.142924422s" podCreationTimestamp="2025-12-11 13:14:59 +0000 UTC" firstStartedPulling="2025-12-11 13:15:29.935572427 +0000 UTC m=+687.507898864" lastFinishedPulling="2025-12-11 13:15:32.676985735 +0000 UTC m=+690.249312172" observedRunningTime="2025-12-11 13:15:33.136881327 +0000 UTC m=+690.709207774" watchObservedRunningTime="2025-12-11 13:15:33.142924422 +0000 UTC m=+690.715250859" Dec 11 13:15:33 crc kubenswrapper[4898]: I1211 13:15:33.164593 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh" podStartSLOduration=32.108086684 podStartE2EDuration="35.164572926s" podCreationTimestamp="2025-12-11 13:14:58 +0000 UTC" firstStartedPulling="2025-12-11 13:15:29.62002331 +0000 UTC m=+687.192349747" lastFinishedPulling="2025-12-11 13:15:32.676509552 +0000 UTC m=+690.248835989" observedRunningTime="2025-12-11 13:15:33.153664457 +0000 UTC m=+690.725990884" watchObservedRunningTime="2025-12-11 13:15:33.164572926 +0000 UTC m=+690.736899383" Dec 11 13:15:38 crc kubenswrapper[4898]: I1211 13:15:38.133595 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" event={"ID":"2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406","Type":"ContainerStarted","Data":"e248074b3362d2ee113c9afd5bb776faca425236c3e994c20bbed21878a74ac2"} Dec 11 13:15:38 crc kubenswrapper[4898]: I1211 13:15:38.135724 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:38 crc kubenswrapper[4898]: I1211 13:15:38.161668 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" podStartSLOduration=31.323380333 podStartE2EDuration="39.161646213s" podCreationTimestamp="2025-12-11 13:14:59 +0000 UTC" firstStartedPulling="2025-12-11 13:15:29.58974927 +0000 UTC m=+687.162075707" lastFinishedPulling="2025-12-11 13:15:37.42801515 +0000 UTC m=+695.000341587" observedRunningTime="2025-12-11 13:15:38.159281508 +0000 UTC m=+695.731607965" watchObservedRunningTime="2025-12-11 13:15:38.161646213 +0000 UTC m=+695.733972690" Dec 11 13:15:38 crc kubenswrapper[4898]: I1211 13:15:38.181480 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" Dec 11 13:15:39 crc kubenswrapper[4898]: I1211 13:15:39.627171 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-nq968" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.278293 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-6wct8"] Dec 11 13:15:44 crc kubenswrapper[4898]: E1211 13:15:44.279835 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eff25b9-7a31-4118-b8c3-c57b8b4714fa" containerName="collect-profiles" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.279916 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eff25b9-7a31-4118-b8c3-c57b8b4714fa" containerName="collect-profiles" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.280084 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eff25b9-7a31-4118-b8c3-c57b8b4714fa" containerName="collect-profiles" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.280594 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-6wct8" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.292562 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.293086 4898 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-prd4s" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.294125 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.296358 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-h9lc7"] Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.297076 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-h9lc7" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.299588 4898 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-wwdhr" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.309414 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-dwxc2"] Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.310145 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-dwxc2" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.313940 4898 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-t5h4d" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.317609 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-6wct8"] Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.321048 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-h9lc7"] Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.327336 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-dwxc2"] Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.380831 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf45c\" (UniqueName: \"kubernetes.io/projected/df56f399-1275-4a62-b700-05d108445723-kube-api-access-kf45c\") pod \"cert-manager-cainjector-7f985d654d-6wct8\" (UID: \"df56f399-1275-4a62-b700-05d108445723\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-6wct8" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.482666 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf45c\" (UniqueName: \"kubernetes.io/projected/df56f399-1275-4a62-b700-05d108445723-kube-api-access-kf45c\") pod \"cert-manager-cainjector-7f985d654d-6wct8\" (UID: \"df56f399-1275-4a62-b700-05d108445723\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-6wct8" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.482952 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b888s\" (UniqueName: \"kubernetes.io/projected/8e5e03d3-bb27-44ef-9f33-fe1175a655ed-kube-api-access-b888s\") pod \"cert-manager-5b446d88c5-h9lc7\" 
(UID: \"8e5e03d3-bb27-44ef-9f33-fe1175a655ed\") " pod="cert-manager/cert-manager-5b446d88c5-h9lc7" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.483105 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xphtq\" (UniqueName: \"kubernetes.io/projected/80c45250-6b80-452d-ade1-a8b024cabf10-kube-api-access-xphtq\") pod \"cert-manager-webhook-5655c58dd6-dwxc2\" (UID: \"80c45250-6b80-452d-ade1-a8b024cabf10\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-dwxc2" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.512481 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf45c\" (UniqueName: \"kubernetes.io/projected/df56f399-1275-4a62-b700-05d108445723-kube-api-access-kf45c\") pod \"cert-manager-cainjector-7f985d654d-6wct8\" (UID: \"df56f399-1275-4a62-b700-05d108445723\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-6wct8" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.584545 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xphtq\" (UniqueName: \"kubernetes.io/projected/80c45250-6b80-452d-ade1-a8b024cabf10-kube-api-access-xphtq\") pod \"cert-manager-webhook-5655c58dd6-dwxc2\" (UID: \"80c45250-6b80-452d-ade1-a8b024cabf10\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-dwxc2" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.584637 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b888s\" (UniqueName: \"kubernetes.io/projected/8e5e03d3-bb27-44ef-9f33-fe1175a655ed-kube-api-access-b888s\") pod \"cert-manager-5b446d88c5-h9lc7\" (UID: \"8e5e03d3-bb27-44ef-9f33-fe1175a655ed\") " pod="cert-manager/cert-manager-5b446d88c5-h9lc7" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.603239 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-6wct8" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.607236 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xphtq\" (UniqueName: \"kubernetes.io/projected/80c45250-6b80-452d-ade1-a8b024cabf10-kube-api-access-xphtq\") pod \"cert-manager-webhook-5655c58dd6-dwxc2\" (UID: \"80c45250-6b80-452d-ade1-a8b024cabf10\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-dwxc2" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.613320 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b888s\" (UniqueName: \"kubernetes.io/projected/8e5e03d3-bb27-44ef-9f33-fe1175a655ed-kube-api-access-b888s\") pod \"cert-manager-5b446d88c5-h9lc7\" (UID: \"8e5e03d3-bb27-44ef-9f33-fe1175a655ed\") " pod="cert-manager/cert-manager-5b446d88c5-h9lc7" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.622082 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-h9lc7" Dec 11 13:15:44 crc kubenswrapper[4898]: I1211 13:15:44.629922 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-dwxc2" Dec 11 13:15:45 crc kubenswrapper[4898]: W1211 13:15:45.038045 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80c45250_6b80_452d_ade1_a8b024cabf10.slice/crio-6b01db9ad37ce3ec13489efe70aa40958c55ac1b992bd9da0955d6cf3b996200 WatchSource:0}: Error finding container 6b01db9ad37ce3ec13489efe70aa40958c55ac1b992bd9da0955d6cf3b996200: Status 404 returned error can't find the container with id 6b01db9ad37ce3ec13489efe70aa40958c55ac1b992bd9da0955d6cf3b996200 Dec 11 13:15:45 crc kubenswrapper[4898]: I1211 13:15:45.038973 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-dwxc2"] Dec 11 13:15:45 crc kubenswrapper[4898]: I1211 13:15:45.183984 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-dwxc2" event={"ID":"80c45250-6b80-452d-ade1-a8b024cabf10","Type":"ContainerStarted","Data":"6b01db9ad37ce3ec13489efe70aa40958c55ac1b992bd9da0955d6cf3b996200"} Dec 11 13:15:45 crc kubenswrapper[4898]: I1211 13:15:45.208995 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-6wct8"] Dec 11 13:15:45 crc kubenswrapper[4898]: W1211 13:15:45.212405 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf56f399_1275_4a62_b700_05d108445723.slice/crio-07b1366d4b211d8601e7b42bd2e7fa83ae7b89e156d58aa6d79a74eb4a1a2c72 WatchSource:0}: Error finding container 07b1366d4b211d8601e7b42bd2e7fa83ae7b89e156d58aa6d79a74eb4a1a2c72: Status 404 returned error can't find the container with id 07b1366d4b211d8601e7b42bd2e7fa83ae7b89e156d58aa6d79a74eb4a1a2c72 Dec 11 13:15:45 crc kubenswrapper[4898]: I1211 13:15:45.228101 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["cert-manager/cert-manager-5b446d88c5-h9lc7"] Dec 11 13:15:45 crc kubenswrapper[4898]: W1211 13:15:45.228402 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e5e03d3_bb27_44ef_9f33_fe1175a655ed.slice/crio-abd3a912163989ff80d750246be4ec93321d175f85d897636d3232ace44c58b3 WatchSource:0}: Error finding container abd3a912163989ff80d750246be4ec93321d175f85d897636d3232ace44c58b3: Status 404 returned error can't find the container with id abd3a912163989ff80d750246be4ec93321d175f85d897636d3232ace44c58b3 Dec 11 13:15:46 crc kubenswrapper[4898]: I1211 13:15:46.191879 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-h9lc7" event={"ID":"8e5e03d3-bb27-44ef-9f33-fe1175a655ed","Type":"ContainerStarted","Data":"abd3a912163989ff80d750246be4ec93321d175f85d897636d3232ace44c58b3"} Dec 11 13:15:46 crc kubenswrapper[4898]: I1211 13:15:46.192680 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-6wct8" event={"ID":"df56f399-1275-4a62-b700-05d108445723","Type":"ContainerStarted","Data":"07b1366d4b211d8601e7b42bd2e7fa83ae7b89e156d58aa6d79a74eb4a1a2c72"} Dec 11 13:15:49 crc kubenswrapper[4898]: I1211 13:15:49.213754 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-h9lc7" event={"ID":"8e5e03d3-bb27-44ef-9f33-fe1175a655ed","Type":"ContainerStarted","Data":"06c0af7bb0af69ff282cdccccf59614651e2b2ea7ef1d3c6b3e2db32324fcbda"} Dec 11 13:15:49 crc kubenswrapper[4898]: I1211 13:15:49.215208 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-dwxc2" event={"ID":"80c45250-6b80-452d-ade1-a8b024cabf10","Type":"ContainerStarted","Data":"b93109f456efc0c744fee0e60e2d79111578e29ca817084264379e77edd5c1af"} Dec 11 13:15:49 crc kubenswrapper[4898]: I1211 13:15:49.215313 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-dwxc2" Dec 11 13:15:49 crc kubenswrapper[4898]: I1211 13:15:49.216430 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-6wct8" event={"ID":"df56f399-1275-4a62-b700-05d108445723","Type":"ContainerStarted","Data":"2eff098720cba520312e1ce62c1aa8610c20a3edffb685f7ea5067bfc8ae6b64"} Dec 11 13:15:49 crc kubenswrapper[4898]: I1211 13:15:49.234658 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-h9lc7" podStartSLOduration=1.967994197 podStartE2EDuration="5.234633748s" podCreationTimestamp="2025-12-11 13:15:44 +0000 UTC" firstStartedPulling="2025-12-11 13:15:45.23493137 +0000 UTC m=+702.807257817" lastFinishedPulling="2025-12-11 13:15:48.501570921 +0000 UTC m=+706.073897368" observedRunningTime="2025-12-11 13:15:49.230061312 +0000 UTC m=+706.802387769" watchObservedRunningTime="2025-12-11 13:15:49.234633748 +0000 UTC m=+706.806960195" Dec 11 13:15:49 crc kubenswrapper[4898]: I1211 13:15:49.268752 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-dwxc2" podStartSLOduration=1.8723999679999999 podStartE2EDuration="5.268728792s" podCreationTimestamp="2025-12-11 13:15:44 +0000 UTC" firstStartedPulling="2025-12-11 13:15:45.040348598 +0000 UTC m=+702.612675035" lastFinishedPulling="2025-12-11 13:15:48.436677422 +0000 UTC m=+706.009003859" observedRunningTime="2025-12-11 13:15:49.254152192 +0000 UTC m=+706.826478649" watchObservedRunningTime="2025-12-11 13:15:49.268728792 +0000 UTC m=+706.841055249" Dec 11 13:15:49 crc kubenswrapper[4898]: I1211 13:15:49.269899 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-6wct8" podStartSLOduration=2.047837715 podStartE2EDuration="5.269888294s" podCreationTimestamp="2025-12-11 13:15:44 +0000 UTC" 
firstStartedPulling="2025-12-11 13:15:45.214678285 +0000 UTC m=+702.787004722" lastFinishedPulling="2025-12-11 13:15:48.436728864 +0000 UTC m=+706.009055301" observedRunningTime="2025-12-11 13:15:49.268385182 +0000 UTC m=+706.840711609" watchObservedRunningTime="2025-12-11 13:15:49.269888294 +0000 UTC m=+706.842214751" Dec 11 13:15:54 crc kubenswrapper[4898]: I1211 13:15:54.633346 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-dwxc2" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.180656 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx"] Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.184070 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.186028 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.196271 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx"] Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.315003 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcvt8\" (UniqueName: \"kubernetes.io/projected/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-kube-api-access-tcvt8\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx\" (UID: \"77e660a9-36a2-45e9-9d33-a5a6feb01cd2\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.315065 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx\" (UID: \"77e660a9-36a2-45e9-9d33-a5a6feb01cd2\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.315270 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx\" (UID: \"77e660a9-36a2-45e9-9d33-a5a6feb01cd2\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.355839 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7"] Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.357345 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.366234 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7"] Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.416721 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcvt8\" (UniqueName: \"kubernetes.io/projected/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-kube-api-access-tcvt8\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx\" (UID: \"77e660a9-36a2-45e9-9d33-a5a6feb01cd2\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.416774 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx\" (UID: \"77e660a9-36a2-45e9-9d33-a5a6feb01cd2\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.416812 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9q7m\" (UniqueName: \"kubernetes.io/projected/2e30a8bb-3d26-41d3-af16-de028edba0ff-kube-api-access-s9q7m\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7\" (UID: \"2e30a8bb-3d26-41d3-af16-de028edba0ff\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.416841 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2e30a8bb-3d26-41d3-af16-de028edba0ff-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7\" (UID: \"2e30a8bb-3d26-41d3-af16-de028edba0ff\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.416857 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx\" (UID: \"77e660a9-36a2-45e9-9d33-a5a6feb01cd2\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.416901 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e30a8bb-3d26-41d3-af16-de028edba0ff-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7\" (UID: \"2e30a8bb-3d26-41d3-af16-de028edba0ff\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.417370 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx\" (UID: \"77e660a9-36a2-45e9-9d33-a5a6feb01cd2\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.417732 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx\" (UID: 
\"77e660a9-36a2-45e9-9d33-a5a6feb01cd2\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.442295 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcvt8\" (UniqueName: \"kubernetes.io/projected/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-kube-api-access-tcvt8\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx\" (UID: \"77e660a9-36a2-45e9-9d33-a5a6feb01cd2\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.517874 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e30a8bb-3d26-41d3-af16-de028edba0ff-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7\" (UID: \"2e30a8bb-3d26-41d3-af16-de028edba0ff\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.517971 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9q7m\" (UniqueName: \"kubernetes.io/projected/2e30a8bb-3d26-41d3-af16-de028edba0ff-kube-api-access-s9q7m\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7\" (UID: \"2e30a8bb-3d26-41d3-af16-de028edba0ff\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.518002 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e30a8bb-3d26-41d3-af16-de028edba0ff-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7\" (UID: \"2e30a8bb-3d26-41d3-af16-de028edba0ff\") " 
pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.518289 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.518470 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e30a8bb-3d26-41d3-af16-de028edba0ff-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7\" (UID: \"2e30a8bb-3d26-41d3-af16-de028edba0ff\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.518605 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e30a8bb-3d26-41d3-af16-de028edba0ff-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7\" (UID: \"2e30a8bb-3d26-41d3-af16-de028edba0ff\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.539412 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9q7m\" (UniqueName: \"kubernetes.io/projected/2e30a8bb-3d26-41d3-af16-de028edba0ff-kube-api-access-s9q7m\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7\" (UID: \"2e30a8bb-3d26-41d3-af16-de028edba0ff\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.717052 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.744642 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx"] Dec 11 13:16:18 crc kubenswrapper[4898]: I1211 13:16:18.827238 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" event={"ID":"77e660a9-36a2-45e9-9d33-a5a6feb01cd2","Type":"ContainerStarted","Data":"196048647e3aa0be68ab426f8eb03c5c036f42c31cbeb1a92c2ac5c7791a6975"} Dec 11 13:16:19 crc kubenswrapper[4898]: I1211 13:16:19.025755 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7"] Dec 11 13:16:19 crc kubenswrapper[4898]: I1211 13:16:19.838898 4898 generic.go:334] "Generic (PLEG): container finished" podID="77e660a9-36a2-45e9-9d33-a5a6feb01cd2" containerID="05ba752867b7190b3f1e27271aa77773b5bec5937ad3a93cd938b137883c7bf8" exitCode=0 Dec 11 13:16:19 crc kubenswrapper[4898]: I1211 13:16:19.839043 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" event={"ID":"77e660a9-36a2-45e9-9d33-a5a6feb01cd2","Type":"ContainerDied","Data":"05ba752867b7190b3f1e27271aa77773b5bec5937ad3a93cd938b137883c7bf8"} Dec 11 13:16:19 crc kubenswrapper[4898]: I1211 13:16:19.842234 4898 generic.go:334] "Generic (PLEG): container finished" podID="2e30a8bb-3d26-41d3-af16-de028edba0ff" containerID="87373f381bf2ac432c7f1975028b5b2ffc97851600ce97a8ee3f35debddef16d" exitCode=0 Dec 11 13:16:19 crc kubenswrapper[4898]: I1211 13:16:19.842281 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" 
event={"ID":"2e30a8bb-3d26-41d3-af16-de028edba0ff","Type":"ContainerDied","Data":"87373f381bf2ac432c7f1975028b5b2ffc97851600ce97a8ee3f35debddef16d"} Dec 11 13:16:19 crc kubenswrapper[4898]: I1211 13:16:19.842322 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" event={"ID":"2e30a8bb-3d26-41d3-af16-de028edba0ff","Type":"ContainerStarted","Data":"97acd0cefdf47bb342559baef3fdeb15c3049080e999d87a384e91a8fd24bacb"} Dec 11 13:16:21 crc kubenswrapper[4898]: E1211 13:16:21.377356 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77e660a9_36a2_45e9_9d33_a5a6feb01cd2.slice/crio-conmon-4f3ebf48e26755a971e839f446f33ab88e1427d3936f8914aa35be7a2ef09fde.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77e660a9_36a2_45e9_9d33_a5a6feb01cd2.slice/crio-4f3ebf48e26755a971e839f446f33ab88e1427d3936f8914aa35be7a2ef09fde.scope\": RecentStats: unable to find data in memory cache]" Dec 11 13:16:21 crc kubenswrapper[4898]: I1211 13:16:21.864350 4898 generic.go:334] "Generic (PLEG): container finished" podID="2e30a8bb-3d26-41d3-af16-de028edba0ff" containerID="674ab730b0f49781f04ab9c9b8e40f1b06fa501ea37b098c70fdf142a47730d4" exitCode=0 Dec 11 13:16:21 crc kubenswrapper[4898]: I1211 13:16:21.864444 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" event={"ID":"2e30a8bb-3d26-41d3-af16-de028edba0ff","Type":"ContainerDied","Data":"674ab730b0f49781f04ab9c9b8e40f1b06fa501ea37b098c70fdf142a47730d4"} Dec 11 13:16:21 crc kubenswrapper[4898]: I1211 13:16:21.871489 4898 generic.go:334] "Generic (PLEG): container finished" podID="77e660a9-36a2-45e9-9d33-a5a6feb01cd2" 
containerID="4f3ebf48e26755a971e839f446f33ab88e1427d3936f8914aa35be7a2ef09fde" exitCode=0 Dec 11 13:16:21 crc kubenswrapper[4898]: I1211 13:16:21.871549 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" event={"ID":"77e660a9-36a2-45e9-9d33-a5a6feb01cd2","Type":"ContainerDied","Data":"4f3ebf48e26755a971e839f446f33ab88e1427d3936f8914aa35be7a2ef09fde"} Dec 11 13:16:22 crc kubenswrapper[4898]: I1211 13:16:22.881401 4898 generic.go:334] "Generic (PLEG): container finished" podID="77e660a9-36a2-45e9-9d33-a5a6feb01cd2" containerID="003490c0378e0c6b02492781365af5376e218b46ae97989eeddf0d3eeb49d21b" exitCode=0 Dec 11 13:16:22 crc kubenswrapper[4898]: I1211 13:16:22.881535 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" event={"ID":"77e660a9-36a2-45e9-9d33-a5a6feb01cd2","Type":"ContainerDied","Data":"003490c0378e0c6b02492781365af5376e218b46ae97989eeddf0d3eeb49d21b"} Dec 11 13:16:22 crc kubenswrapper[4898]: I1211 13:16:22.885402 4898 generic.go:334] "Generic (PLEG): container finished" podID="2e30a8bb-3d26-41d3-af16-de028edba0ff" containerID="db5448eebde09837fa9724c73c618b60c10108cc0fee812a75f14e360a4f5dd2" exitCode=0 Dec 11 13:16:22 crc kubenswrapper[4898]: I1211 13:16:22.885452 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" event={"ID":"2e30a8bb-3d26-41d3-af16-de028edba0ff","Type":"ContainerDied","Data":"db5448eebde09837fa9724c73c618b60c10108cc0fee812a75f14e360a4f5dd2"} Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.237806 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.243878 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.320611 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-bundle\") pod \"77e660a9-36a2-45e9-9d33-a5a6feb01cd2\" (UID: \"77e660a9-36a2-45e9-9d33-a5a6feb01cd2\") " Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.320678 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcvt8\" (UniqueName: \"kubernetes.io/projected/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-kube-api-access-tcvt8\") pod \"77e660a9-36a2-45e9-9d33-a5a6feb01cd2\" (UID: \"77e660a9-36a2-45e9-9d33-a5a6feb01cd2\") " Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.320711 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-util\") pod \"77e660a9-36a2-45e9-9d33-a5a6feb01cd2\" (UID: \"77e660a9-36a2-45e9-9d33-a5a6feb01cd2\") " Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.320755 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e30a8bb-3d26-41d3-af16-de028edba0ff-util\") pod \"2e30a8bb-3d26-41d3-af16-de028edba0ff\" (UID: \"2e30a8bb-3d26-41d3-af16-de028edba0ff\") " Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.320804 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e30a8bb-3d26-41d3-af16-de028edba0ff-bundle\") pod 
\"2e30a8bb-3d26-41d3-af16-de028edba0ff\" (UID: \"2e30a8bb-3d26-41d3-af16-de028edba0ff\") " Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.320842 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9q7m\" (UniqueName: \"kubernetes.io/projected/2e30a8bb-3d26-41d3-af16-de028edba0ff-kube-api-access-s9q7m\") pod \"2e30a8bb-3d26-41d3-af16-de028edba0ff\" (UID: \"2e30a8bb-3d26-41d3-af16-de028edba0ff\") " Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.322010 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e30a8bb-3d26-41d3-af16-de028edba0ff-bundle" (OuterVolumeSpecName: "bundle") pod "2e30a8bb-3d26-41d3-af16-de028edba0ff" (UID: "2e30a8bb-3d26-41d3-af16-de028edba0ff"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.322174 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-bundle" (OuterVolumeSpecName: "bundle") pod "77e660a9-36a2-45e9-9d33-a5a6feb01cd2" (UID: "77e660a9-36a2-45e9-9d33-a5a6feb01cd2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.329626 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-kube-api-access-tcvt8" (OuterVolumeSpecName: "kube-api-access-tcvt8") pod "77e660a9-36a2-45e9-9d33-a5a6feb01cd2" (UID: "77e660a9-36a2-45e9-9d33-a5a6feb01cd2"). InnerVolumeSpecName "kube-api-access-tcvt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.333878 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e30a8bb-3d26-41d3-af16-de028edba0ff-util" (OuterVolumeSpecName: "util") pod "2e30a8bb-3d26-41d3-af16-de028edba0ff" (UID: "2e30a8bb-3d26-41d3-af16-de028edba0ff"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.335705 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e30a8bb-3d26-41d3-af16-de028edba0ff-kube-api-access-s9q7m" (OuterVolumeSpecName: "kube-api-access-s9q7m") pod "2e30a8bb-3d26-41d3-af16-de028edba0ff" (UID: "2e30a8bb-3d26-41d3-af16-de028edba0ff"). InnerVolumeSpecName "kube-api-access-s9q7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.421857 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9q7m\" (UniqueName: \"kubernetes.io/projected/2e30a8bb-3d26-41d3-af16-de028edba0ff-kube-api-access-s9q7m\") on node \"crc\" DevicePath \"\"" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.421892 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.421903 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcvt8\" (UniqueName: \"kubernetes.io/projected/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-kube-api-access-tcvt8\") on node \"crc\" DevicePath \"\"" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.421913 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e30a8bb-3d26-41d3-af16-de028edba0ff-util\") on node \"crc\" DevicePath \"\"" Dec 
11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.421924 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e30a8bb-3d26-41d3-af16-de028edba0ff-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.464535 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-util" (OuterVolumeSpecName: "util") pod "77e660a9-36a2-45e9-9d33-a5a6feb01cd2" (UID: "77e660a9-36a2-45e9-9d33-a5a6feb01cd2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.523279 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77e660a9-36a2-45e9-9d33-a5a6feb01cd2-util\") on node \"crc\" DevicePath \"\"" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.905168 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" event={"ID":"77e660a9-36a2-45e9-9d33-a5a6feb01cd2","Type":"ContainerDied","Data":"196048647e3aa0be68ab426f8eb03c5c036f42c31cbeb1a92c2ac5c7791a6975"} Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.905235 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="196048647e3aa0be68ab426f8eb03c5c036f42c31cbeb1a92c2ac5c7791a6975" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.905231 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.909552 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" event={"ID":"2e30a8bb-3d26-41d3-af16-de028edba0ff","Type":"ContainerDied","Data":"97acd0cefdf47bb342559baef3fdeb15c3049080e999d87a384e91a8fd24bacb"} Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.909633 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97acd0cefdf47bb342559baef3fdeb15c3049080e999d87a384e91a8fd24bacb" Dec 11 13:16:24 crc kubenswrapper[4898]: I1211 13:16:24.909708 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.379135 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc"] Dec 11 13:16:35 crc kubenswrapper[4898]: E1211 13:16:35.379948 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e660a9-36a2-45e9-9d33-a5a6feb01cd2" containerName="util" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.379966 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e660a9-36a2-45e9-9d33-a5a6feb01cd2" containerName="util" Dec 11 13:16:35 crc kubenswrapper[4898]: E1211 13:16:35.379976 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e30a8bb-3d26-41d3-af16-de028edba0ff" containerName="util" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.379984 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e30a8bb-3d26-41d3-af16-de028edba0ff" containerName="util" Dec 11 13:16:35 crc kubenswrapper[4898]: E1211 13:16:35.379996 4898 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="77e660a9-36a2-45e9-9d33-a5a6feb01cd2" containerName="extract" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.380005 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e660a9-36a2-45e9-9d33-a5a6feb01cd2" containerName="extract" Dec 11 13:16:35 crc kubenswrapper[4898]: E1211 13:16:35.380021 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e660a9-36a2-45e9-9d33-a5a6feb01cd2" containerName="pull" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.380030 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e660a9-36a2-45e9-9d33-a5a6feb01cd2" containerName="pull" Dec 11 13:16:35 crc kubenswrapper[4898]: E1211 13:16:35.380043 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e30a8bb-3d26-41d3-af16-de028edba0ff" containerName="pull" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.380051 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e30a8bb-3d26-41d3-af16-de028edba0ff" containerName="pull" Dec 11 13:16:35 crc kubenswrapper[4898]: E1211 13:16:35.380061 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e30a8bb-3d26-41d3-af16-de028edba0ff" containerName="extract" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.380068 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e30a8bb-3d26-41d3-af16-de028edba0ff" containerName="extract" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.380208 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="77e660a9-36a2-45e9-9d33-a5a6feb01cd2" containerName="extract" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.380228 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e30a8bb-3d26-41d3-af16-de028edba0ff" containerName="extract" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.381056 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.383483 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.383497 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-5bsxf" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.383573 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.383600 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.383642 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.383788 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.407627 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc"] Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.485405 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b4cf67d3-b13e-4afb-be20-80dc0801c69c-apiservice-cert\") pod \"loki-operator-controller-manager-7d9d9f99f6-7sstc\" (UID: \"b4cf67d3-b13e-4afb-be20-80dc0801c69c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 
13:16:35.485548 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4cf67d3-b13e-4afb-be20-80dc0801c69c-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7d9d9f99f6-7sstc\" (UID: \"b4cf67d3-b13e-4afb-be20-80dc0801c69c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.485579 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rxmh\" (UniqueName: \"kubernetes.io/projected/b4cf67d3-b13e-4afb-be20-80dc0801c69c-kube-api-access-8rxmh\") pod \"loki-operator-controller-manager-7d9d9f99f6-7sstc\" (UID: \"b4cf67d3-b13e-4afb-be20-80dc0801c69c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.485645 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b4cf67d3-b13e-4afb-be20-80dc0801c69c-webhook-cert\") pod \"loki-operator-controller-manager-7d9d9f99f6-7sstc\" (UID: \"b4cf67d3-b13e-4afb-be20-80dc0801c69c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.485690 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b4cf67d3-b13e-4afb-be20-80dc0801c69c-manager-config\") pod \"loki-operator-controller-manager-7d9d9f99f6-7sstc\" (UID: \"b4cf67d3-b13e-4afb-be20-80dc0801c69c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.587229 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b4cf67d3-b13e-4afb-be20-80dc0801c69c-apiservice-cert\") pod \"loki-operator-controller-manager-7d9d9f99f6-7sstc\" (UID: \"b4cf67d3-b13e-4afb-be20-80dc0801c69c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.587305 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4cf67d3-b13e-4afb-be20-80dc0801c69c-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7d9d9f99f6-7sstc\" (UID: \"b4cf67d3-b13e-4afb-be20-80dc0801c69c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.587334 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rxmh\" (UniqueName: \"kubernetes.io/projected/b4cf67d3-b13e-4afb-be20-80dc0801c69c-kube-api-access-8rxmh\") pod \"loki-operator-controller-manager-7d9d9f99f6-7sstc\" (UID: \"b4cf67d3-b13e-4afb-be20-80dc0801c69c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.587404 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b4cf67d3-b13e-4afb-be20-80dc0801c69c-webhook-cert\") pod \"loki-operator-controller-manager-7d9d9f99f6-7sstc\" (UID: \"b4cf67d3-b13e-4afb-be20-80dc0801c69c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.587470 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b4cf67d3-b13e-4afb-be20-80dc0801c69c-manager-config\") pod \"loki-operator-controller-manager-7d9d9f99f6-7sstc\" (UID: 
\"b4cf67d3-b13e-4afb-be20-80dc0801c69c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.588412 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b4cf67d3-b13e-4afb-be20-80dc0801c69c-manager-config\") pod \"loki-operator-controller-manager-7d9d9f99f6-7sstc\" (UID: \"b4cf67d3-b13e-4afb-be20-80dc0801c69c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.593047 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b4cf67d3-b13e-4afb-be20-80dc0801c69c-apiservice-cert\") pod \"loki-operator-controller-manager-7d9d9f99f6-7sstc\" (UID: \"b4cf67d3-b13e-4afb-be20-80dc0801c69c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.594724 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b4cf67d3-b13e-4afb-be20-80dc0801c69c-webhook-cert\") pod \"loki-operator-controller-manager-7d9d9f99f6-7sstc\" (UID: \"b4cf67d3-b13e-4afb-be20-80dc0801c69c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.605285 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4cf67d3-b13e-4afb-be20-80dc0801c69c-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7d9d9f99f6-7sstc\" (UID: \"b4cf67d3-b13e-4afb-be20-80dc0801c69c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.611424 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8rxmh\" (UniqueName: \"kubernetes.io/projected/b4cf67d3-b13e-4afb-be20-80dc0801c69c-kube-api-access-8rxmh\") pod \"loki-operator-controller-manager-7d9d9f99f6-7sstc\" (UID: \"b4cf67d3-b13e-4afb-be20-80dc0801c69c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:35 crc kubenswrapper[4898]: I1211 13:16:35.699518 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:36 crc kubenswrapper[4898]: I1211 13:16:36.243506 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc"] Dec 11 13:16:37 crc kubenswrapper[4898]: I1211 13:16:37.015870 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" event={"ID":"b4cf67d3-b13e-4afb-be20-80dc0801c69c","Type":"ContainerStarted","Data":"92964594093ae1b1cfba1c40be6f97326da8a231eab6041c52fa0cbb390b8388"} Dec 11 13:16:38 crc kubenswrapper[4898]: I1211 13:16:38.976147 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-5qw7n"] Dec 11 13:16:38 crc kubenswrapper[4898]: I1211 13:16:38.977437 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-5qw7n" Dec 11 13:16:38 crc kubenswrapper[4898]: I1211 13:16:38.980193 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Dec 11 13:16:38 crc kubenswrapper[4898]: I1211 13:16:38.980202 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Dec 11 13:16:38 crc kubenswrapper[4898]: I1211 13:16:38.981060 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-k4z26" Dec 11 13:16:38 crc kubenswrapper[4898]: I1211 13:16:38.994712 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-5qw7n"] Dec 11 13:16:39 crc kubenswrapper[4898]: I1211 13:16:39.038226 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bprm2\" (UniqueName: \"kubernetes.io/projected/7ffaf6d3-c69c-4b78-8364-be63b25056c3-kube-api-access-bprm2\") pod \"cluster-logging-operator-ff9846bd-5qw7n\" (UID: \"7ffaf6d3-c69c-4b78-8364-be63b25056c3\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-5qw7n" Dec 11 13:16:39 crc kubenswrapper[4898]: I1211 13:16:39.140420 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bprm2\" (UniqueName: \"kubernetes.io/projected/7ffaf6d3-c69c-4b78-8364-be63b25056c3-kube-api-access-bprm2\") pod \"cluster-logging-operator-ff9846bd-5qw7n\" (UID: \"7ffaf6d3-c69c-4b78-8364-be63b25056c3\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-5qw7n" Dec 11 13:16:39 crc kubenswrapper[4898]: I1211 13:16:39.164829 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bprm2\" (UniqueName: \"kubernetes.io/projected/7ffaf6d3-c69c-4b78-8364-be63b25056c3-kube-api-access-bprm2\") pod 
\"cluster-logging-operator-ff9846bd-5qw7n\" (UID: \"7ffaf6d3-c69c-4b78-8364-be63b25056c3\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-5qw7n" Dec 11 13:16:39 crc kubenswrapper[4898]: I1211 13:16:39.349739 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-5qw7n" Dec 11 13:16:39 crc kubenswrapper[4898]: I1211 13:16:39.883653 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-5qw7n"] Dec 11 13:16:39 crc kubenswrapper[4898]: W1211 13:16:39.900000 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ffaf6d3_c69c_4b78_8364_be63b25056c3.slice/crio-44486ff3060f32288b9c00d3a3841f72a0fd3dacdc0a112a975418c267d37036 WatchSource:0}: Error finding container 44486ff3060f32288b9c00d3a3841f72a0fd3dacdc0a112a975418c267d37036: Status 404 returned error can't find the container with id 44486ff3060f32288b9c00d3a3841f72a0fd3dacdc0a112a975418c267d37036 Dec 11 13:16:40 crc kubenswrapper[4898]: I1211 13:16:40.046989 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-5qw7n" event={"ID":"7ffaf6d3-c69c-4b78-8364-be63b25056c3","Type":"ContainerStarted","Data":"44486ff3060f32288b9c00d3a3841f72a0fd3dacdc0a112a975418c267d37036"} Dec 11 13:16:43 crc kubenswrapper[4898]: I1211 13:16:43.067859 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" event={"ID":"b4cf67d3-b13e-4afb-be20-80dc0801c69c","Type":"ContainerStarted","Data":"a71b889fd0cb427a50bd9035efbcf72d175000a5943a80b510e134432254186c"} Dec 11 13:16:45 crc kubenswrapper[4898]: I1211 13:16:45.113399 4898 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 11 13:16:50 crc 
kubenswrapper[4898]: I1211 13:16:50.122933 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-5qw7n" event={"ID":"7ffaf6d3-c69c-4b78-8364-be63b25056c3","Type":"ContainerStarted","Data":"ee8d7cc19dbd5180afbf2a1dfae5c7f1aa42411facebb69b54e88445552852e6"} Dec 11 13:16:50 crc kubenswrapper[4898]: I1211 13:16:50.126036 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" event={"ID":"b4cf67d3-b13e-4afb-be20-80dc0801c69c","Type":"ContainerStarted","Data":"6ac8e7a13c9aa6698bd5b50664087abb8ca3f713606a040df0e96de686e6e726"} Dec 11 13:16:50 crc kubenswrapper[4898]: I1211 13:16:50.126692 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:50 crc kubenswrapper[4898]: I1211 13:16:50.128924 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" Dec 11 13:16:50 crc kubenswrapper[4898]: I1211 13:16:50.147385 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-ff9846bd-5qw7n" podStartSLOduration=2.194826625 podStartE2EDuration="12.147358756s" podCreationTimestamp="2025-12-11 13:16:38 +0000 UTC" firstStartedPulling="2025-12-11 13:16:39.906087203 +0000 UTC m=+757.478413640" lastFinishedPulling="2025-12-11 13:16:49.858619314 +0000 UTC m=+767.430945771" observedRunningTime="2025-12-11 13:16:50.144322424 +0000 UTC m=+767.716648901" watchObservedRunningTime="2025-12-11 13:16:50.147358756 +0000 UTC m=+767.719685193" Dec 11 13:16:50 crc kubenswrapper[4898]: I1211 13:16:50.194251 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" podStartSLOduration=1.566600576 
podStartE2EDuration="15.194229205s" podCreationTimestamp="2025-12-11 13:16:35 +0000 UTC" firstStartedPulling="2025-12-11 13:16:36.250969794 +0000 UTC m=+753.823296231" lastFinishedPulling="2025-12-11 13:16:49.878598423 +0000 UTC m=+767.450924860" observedRunningTime="2025-12-11 13:16:50.189928484 +0000 UTC m=+767.762254941" watchObservedRunningTime="2025-12-11 13:16:50.194229205 +0000 UTC m=+767.766555642" Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.017430 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.018789 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.020804 4898 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-zxv65" Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.021095 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.021562 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.071346 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.185436 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd8rm\" (UniqueName: \"kubernetes.io/projected/c66c6173-59fa-46e0-8101-aec532f90448-kube-api-access-vd8rm\") pod \"minio\" (UID: \"c66c6173-59fa-46e0-8101-aec532f90448\") " pod="minio-dev/minio" Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.185535 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d872606b-2a18-4a11-ae0c-1594baa6ee12\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d872606b-2a18-4a11-ae0c-1594baa6ee12\") pod \"minio\" (UID: \"c66c6173-59fa-46e0-8101-aec532f90448\") " pod="minio-dev/minio" Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.287198 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd8rm\" (UniqueName: \"kubernetes.io/projected/c66c6173-59fa-46e0-8101-aec532f90448-kube-api-access-vd8rm\") pod \"minio\" (UID: \"c66c6173-59fa-46e0-8101-aec532f90448\") " pod="minio-dev/minio" Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.287258 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d872606b-2a18-4a11-ae0c-1594baa6ee12\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d872606b-2a18-4a11-ae0c-1594baa6ee12\") pod \"minio\" (UID: \"c66c6173-59fa-46e0-8101-aec532f90448\") " pod="minio-dev/minio" Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.290058 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.290085 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d872606b-2a18-4a11-ae0c-1594baa6ee12\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d872606b-2a18-4a11-ae0c-1594baa6ee12\") pod \"minio\" (UID: \"c66c6173-59fa-46e0-8101-aec532f90448\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/347ba0f01c5414e9aa687e8b077937658627dd72636411a089243ef747a99179/globalmount\"" pod="minio-dev/minio" Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.311847 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd8rm\" (UniqueName: \"kubernetes.io/projected/c66c6173-59fa-46e0-8101-aec532f90448-kube-api-access-vd8rm\") pod \"minio\" (UID: \"c66c6173-59fa-46e0-8101-aec532f90448\") " pod="minio-dev/minio" Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.312335 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d872606b-2a18-4a11-ae0c-1594baa6ee12\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d872606b-2a18-4a11-ae0c-1594baa6ee12\") pod \"minio\" (UID: \"c66c6173-59fa-46e0-8101-aec532f90448\") " pod="minio-dev/minio" Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.343024 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Dec 11 13:16:55 crc kubenswrapper[4898]: I1211 13:16:55.775115 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 11 13:16:56 crc kubenswrapper[4898]: I1211 13:16:56.161999 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"c66c6173-59fa-46e0-8101-aec532f90448","Type":"ContainerStarted","Data":"6ae80f872c699152cbc76d722c456b925779f0cc2effc81d034bcf0f4362f136"} Dec 11 13:17:00 crc kubenswrapper[4898]: I1211 13:17:00.190473 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"c66c6173-59fa-46e0-8101-aec532f90448","Type":"ContainerStarted","Data":"4e8a6bf2f7496d2540641713bef1f5d6b74a3f246bb5f232099fe80cd5f73d0b"} Dec 11 13:17:00 crc kubenswrapper[4898]: I1211 13:17:00.212021 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.33007187 podStartE2EDuration="8.212003613s" podCreationTimestamp="2025-12-11 13:16:52 +0000 UTC" firstStartedPulling="2025-12-11 13:16:55.779220976 +0000 UTC m=+773.351547403" lastFinishedPulling="2025-12-11 13:16:59.661152699 +0000 UTC m=+777.233479146" observedRunningTime="2025-12-11 13:17:00.209321471 +0000 UTC m=+777.781647938" watchObservedRunningTime="2025-12-11 13:17:00.212003613 +0000 UTC m=+777.784330070" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.135286 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m"] Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.136739 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.139646 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-zlbmq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.140194 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.140391 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.142145 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.142359 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.143307 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m"] Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.231447 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/841a3e5b-876d-43b2-b24a-d5c01876c30d-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-qjz7m\" (UID: \"841a3e5b-876d-43b2-b24a-d5c01876c30d\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.231590 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xtwj\" (UniqueName: \"kubernetes.io/projected/841a3e5b-876d-43b2-b24a-d5c01876c30d-kube-api-access-8xtwj\") pod \"logging-loki-distributor-76cc67bf56-qjz7m\" (UID: 
\"841a3e5b-876d-43b2-b24a-d5c01876c30d\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.231632 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/841a3e5b-876d-43b2-b24a-d5c01876c30d-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-qjz7m\" (UID: \"841a3e5b-876d-43b2-b24a-d5c01876c30d\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.231712 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/841a3e5b-876d-43b2-b24a-d5c01876c30d-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-qjz7m\" (UID: \"841a3e5b-876d-43b2-b24a-d5c01876c30d\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.231767 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841a3e5b-876d-43b2-b24a-d5c01876c30d-config\") pod \"logging-loki-distributor-76cc67bf56-qjz7m\" (UID: \"841a3e5b-876d-43b2-b24a-d5c01876c30d\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.283504 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-62tqd"] Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.284307 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.286635 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.286813 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.286943 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.312941 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-62tqd"] Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.333117 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/841a3e5b-876d-43b2-b24a-d5c01876c30d-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-qjz7m\" (UID: \"841a3e5b-876d-43b2-b24a-d5c01876c30d\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.333185 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xtwj\" (UniqueName: \"kubernetes.io/projected/841a3e5b-876d-43b2-b24a-d5c01876c30d-kube-api-access-8xtwj\") pod \"logging-loki-distributor-76cc67bf56-qjz7m\" (UID: \"841a3e5b-876d-43b2-b24a-d5c01876c30d\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.333240 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/841a3e5b-876d-43b2-b24a-d5c01876c30d-logging-loki-distributor-grpc\") pod 
\"logging-loki-distributor-76cc67bf56-qjz7m\" (UID: \"841a3e5b-876d-43b2-b24a-d5c01876c30d\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.333322 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/841a3e5b-876d-43b2-b24a-d5c01876c30d-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-qjz7m\" (UID: \"841a3e5b-876d-43b2-b24a-d5c01876c30d\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.333351 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841a3e5b-876d-43b2-b24a-d5c01876c30d-config\") pod \"logging-loki-distributor-76cc67bf56-qjz7m\" (UID: \"841a3e5b-876d-43b2-b24a-d5c01876c30d\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.333987 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/841a3e5b-876d-43b2-b24a-d5c01876c30d-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-qjz7m\" (UID: \"841a3e5b-876d-43b2-b24a-d5c01876c30d\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.334160 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841a3e5b-876d-43b2-b24a-d5c01876c30d-config\") pod \"logging-loki-distributor-76cc67bf56-qjz7m\" (UID: \"841a3e5b-876d-43b2-b24a-d5c01876c30d\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.339325 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/841a3e5b-876d-43b2-b24a-d5c01876c30d-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-qjz7m\" (UID: \"841a3e5b-876d-43b2-b24a-d5c01876c30d\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.341032 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/841a3e5b-876d-43b2-b24a-d5c01876c30d-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-qjz7m\" (UID: \"841a3e5b-876d-43b2-b24a-d5c01876c30d\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.348137 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xtwj\" (UniqueName: \"kubernetes.io/projected/841a3e5b-876d-43b2-b24a-d5c01876c30d-kube-api-access-8xtwj\") pod \"logging-loki-distributor-76cc67bf56-qjz7m\" (UID: \"841a3e5b-876d-43b2-b24a-d5c01876c30d\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.395616 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq"] Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.397133 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.398311 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.399270 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.409401 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq"] Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.434289 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-config\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.434390 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc7b7\" (UniqueName: \"kubernetes.io/projected/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-kube-api-access-lc7b7\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.434414 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc 
kubenswrapper[4898]: I1211 13:17:04.434430 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.434474 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.434508 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.456514 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.487491 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-69ffd5987-wz9b5"] Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.488867 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.491764 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.491764 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.491986 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.492131 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.492250 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.501029 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-69ffd5987-jj95c"] Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.502026 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.510530 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-778pr" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.527382 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-69ffd5987-wz9b5"] Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.535975 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.536040 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98efd2cb-8cd8-49c2-a54b-5a04cf51dc71-config\") pod \"logging-loki-query-frontend-84558f7c9f-rjbbq\" (UID: \"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.536065 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/98efd2cb-8cd8-49c2-a54b-5a04cf51dc71-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-rjbbq\" (UID: \"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.536082 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/98efd2cb-8cd8-49c2-a54b-5a04cf51dc71-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-rjbbq\" (UID: \"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.536101 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-config\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.536122 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/98efd2cb-8cd8-49c2-a54b-5a04cf51dc71-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-rjbbq\" (UID: \"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.536151 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpnt4\" (UniqueName: \"kubernetes.io/projected/98efd2cb-8cd8-49c2-a54b-5a04cf51dc71-kube-api-access-jpnt4\") pod \"logging-loki-query-frontend-84558f7c9f-rjbbq\" (UID: \"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.536204 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc7b7\" (UniqueName: \"kubernetes.io/projected/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-kube-api-access-lc7b7\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " 
pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.536220 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.536236 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.536259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.536544 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-69ffd5987-jj95c"] Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.537035 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc 
kubenswrapper[4898]: I1211 13:17:04.537675 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-config\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.541285 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.546036 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.558580 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.563186 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc7b7\" (UniqueName: \"kubernetes.io/projected/4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0-kube-api-access-lc7b7\") pod \"logging-loki-querier-5895d59bb8-62tqd\" (UID: \"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0\") " 
pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.598449 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638092 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638468 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/6d6f1657-b9dd-4fba-a216-ca660a4fa958-lokistack-gateway\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638492 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d6f1657-b9dd-4fba-a216-ca660a4fa958-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638559 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/6d6f1657-b9dd-4fba-a216-ca660a4fa958-rbac\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " 
pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638591 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6d6f1657-b9dd-4fba-a216-ca660a4fa958-tls-secret\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638629 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98efd2cb-8cd8-49c2-a54b-5a04cf51dc71-config\") pod \"logging-loki-query-frontend-84558f7c9f-rjbbq\" (UID: \"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638662 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/98efd2cb-8cd8-49c2-a54b-5a04cf51dc71-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-rjbbq\" (UID: \"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638686 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98efd2cb-8cd8-49c2-a54b-5a04cf51dc71-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-rjbbq\" (UID: \"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638718 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: 
\"kubernetes.io/secret/98efd2cb-8cd8-49c2-a54b-5a04cf51dc71-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-rjbbq\" (UID: \"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638773 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpnt4\" (UniqueName: \"kubernetes.io/projected/98efd2cb-8cd8-49c2-a54b-5a04cf51dc71-kube-api-access-jpnt4\") pod \"logging-loki-query-frontend-84558f7c9f-rjbbq\" (UID: \"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638814 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/6d6f1657-b9dd-4fba-a216-ca660a4fa958-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638846 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-rbac\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638868 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-tenants\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " 
pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638900 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7lrh\" (UniqueName: \"kubernetes.io/projected/6d6f1657-b9dd-4fba-a216-ca660a4fa958-kube-api-access-b7lrh\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638936 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvlm8\" (UniqueName: \"kubernetes.io/projected/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-kube-api-access-vvlm8\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638964 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-lokistack-gateway\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.638986 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-tls-secret\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.639011 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.639033 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d6f1657-b9dd-4fba-a216-ca660a4fa958-logging-loki-ca-bundle\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.639056 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-logging-loki-ca-bundle\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.639080 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/6d6f1657-b9dd-4fba-a216-ca660a4fa958-tenants\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.639675 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98efd2cb-8cd8-49c2-a54b-5a04cf51dc71-config\") pod \"logging-loki-query-frontend-84558f7c9f-rjbbq\" (UID: \"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71\") " 
pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.640324 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98efd2cb-8cd8-49c2-a54b-5a04cf51dc71-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-rjbbq\" (UID: \"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.644110 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/98efd2cb-8cd8-49c2-a54b-5a04cf51dc71-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-rjbbq\" (UID: \"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.647822 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/98efd2cb-8cd8-49c2-a54b-5a04cf51dc71-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-rjbbq\" (UID: \"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.659064 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpnt4\" (UniqueName: \"kubernetes.io/projected/98efd2cb-8cd8-49c2-a54b-5a04cf51dc71-kube-api-access-jpnt4\") pod \"logging-loki-query-frontend-84558f7c9f-rjbbq\" (UID: \"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.714557 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740512 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740548 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d6f1657-b9dd-4fba-a216-ca660a4fa958-logging-loki-ca-bundle\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740568 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-logging-loki-ca-bundle\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740589 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/6d6f1657-b9dd-4fba-a216-ca660a4fa958-tenants\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740608 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6d6f1657-b9dd-4fba-a216-ca660a4fa958-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740625 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740642 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/6d6f1657-b9dd-4fba-a216-ca660a4fa958-lokistack-gateway\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740665 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/6d6f1657-b9dd-4fba-a216-ca660a4fa958-rbac\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740696 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6d6f1657-b9dd-4fba-a216-ca660a4fa958-tls-secret\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740752 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/6d6f1657-b9dd-4fba-a216-ca660a4fa958-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740774 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-rbac\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740791 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-tenants\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740814 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7lrh\" (UniqueName: \"kubernetes.io/projected/6d6f1657-b9dd-4fba-a216-ca660a4fa958-kube-api-access-b7lrh\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740838 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvlm8\" (UniqueName: \"kubernetes.io/projected/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-kube-api-access-vvlm8\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " 
pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740859 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-lokistack-gateway\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.740872 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-tls-secret\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: E1211 13:17:04.740945 4898 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Dec 11 13:17:04 crc kubenswrapper[4898]: E1211 13:17:04.740997 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-tls-secret podName:c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce nodeName:}" failed. No retries permitted until 2025-12-11 13:17:05.240977422 +0000 UTC m=+782.813303859 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-tls-secret") pod "logging-loki-gateway-69ffd5987-jj95c" (UID: "c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce") : secret "logging-loki-gateway-http" not found Dec 11 13:17:04 crc kubenswrapper[4898]: E1211 13:17:04.741274 4898 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Dec 11 13:17:04 crc kubenswrapper[4898]: E1211 13:17:04.741358 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d6f1657-b9dd-4fba-a216-ca660a4fa958-tls-secret podName:6d6f1657-b9dd-4fba-a216-ca660a4fa958 nodeName:}" failed. No retries permitted until 2025-12-11 13:17:05.241336713 +0000 UTC m=+782.813663200 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/6d6f1657-b9dd-4fba-a216-ca660a4fa958-tls-secret") pod "logging-loki-gateway-69ffd5987-wz9b5" (UID: "6d6f1657-b9dd-4fba-a216-ca660a4fa958") : secret "logging-loki-gateway-http" not found Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.742353 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d6f1657-b9dd-4fba-a216-ca660a4fa958-logging-loki-ca-bundle\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.743022 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-logging-loki-ca-bundle\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc 
kubenswrapper[4898]: I1211 13:17:04.746083 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d6f1657-b9dd-4fba-a216-ca660a4fa958-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.746727 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/6d6f1657-b9dd-4fba-a216-ca660a4fa958-tenants\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.746954 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-lokistack-gateway\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.747116 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-rbac\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.748994 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/6d6f1657-b9dd-4fba-a216-ca660a4fa958-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " 
pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.749026 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/6d6f1657-b9dd-4fba-a216-ca660a4fa958-lokistack-gateway\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.749143 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-tenants\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.749265 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/6d6f1657-b9dd-4fba-a216-ca660a4fa958-rbac\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.756238 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.756490 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-logging-loki-gateway-client-http\") pod 
\"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.759836 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7lrh\" (UniqueName: \"kubernetes.io/projected/6d6f1657-b9dd-4fba-a216-ca660a4fa958-kube-api-access-b7lrh\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.761374 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvlm8\" (UniqueName: \"kubernetes.io/projected/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-kube-api-access-vvlm8\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.969222 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m"] Dec 11 13:17:04 crc kubenswrapper[4898]: W1211 13:17:04.974985 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod841a3e5b_876d_43b2_b24a_d5c01876c30d.slice/crio-3890f9aed2471a63f84e102ec6f8f9c714537e6f48f0a631fed8e6c945f4af2f WatchSource:0}: Error finding container 3890f9aed2471a63f84e102ec6f8f9c714537e6f48f0a631fed8e6c945f4af2f: Status 404 returned error can't find the container with id 3890f9aed2471a63f84e102ec6f8f9c714537e6f48f0a631fed8e6c945f4af2f Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.995322 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:17:04 crc kubenswrapper[4898]: I1211 13:17:04.995663 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.044375 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-62tqd"] Dec 11 13:17:05 crc kubenswrapper[4898]: W1211 13:17:05.044633 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ecec7bc_8ce0_46ce_99fd_2a6fbcc626d0.slice/crio-4f96ad5ca41e54f125e0beb124ee06191f200dcaa26a4e49803bcf2edede57cd WatchSource:0}: Error finding container 4f96ad5ca41e54f125e0beb124ee06191f200dcaa26a4e49803bcf2edede57cd: Status 404 returned error can't find the container with id 4f96ad5ca41e54f125e0beb124ee06191f200dcaa26a4e49803bcf2edede57cd Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.165239 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq"] Dec 11 13:17:05 crc kubenswrapper[4898]: W1211 13:17:05.172003 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98efd2cb_8cd8_49c2_a54b_5a04cf51dc71.slice/crio-ac76a56f7120e5b5c92256b0ff0d22875470de19730c88bc7a315a1e7b5eb6e7 WatchSource:0}: Error finding container ac76a56f7120e5b5c92256b0ff0d22875470de19730c88bc7a315a1e7b5eb6e7: Status 404 returned error can't find the container with id ac76a56f7120e5b5c92256b0ff0d22875470de19730c88bc7a315a1e7b5eb6e7 Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 
13:17:05.228418 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" event={"ID":"841a3e5b-876d-43b2-b24a-d5c01876c30d","Type":"ContainerStarted","Data":"3890f9aed2471a63f84e102ec6f8f9c714537e6f48f0a631fed8e6c945f4af2f"} Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.229504 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" event={"ID":"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0","Type":"ContainerStarted","Data":"4f96ad5ca41e54f125e0beb124ee06191f200dcaa26a4e49803bcf2edede57cd"} Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.230205 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" event={"ID":"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71","Type":"ContainerStarted","Data":"ac76a56f7120e5b5c92256b0ff0d22875470de19730c88bc7a315a1e7b5eb6e7"} Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.249406 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-tls-secret\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.249513 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6d6f1657-b9dd-4fba-a216-ca660a4fa958-tls-secret\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.254157 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: 
\"kubernetes.io/secret/6d6f1657-b9dd-4fba-a216-ca660a4fa958-tls-secret\") pod \"logging-loki-gateway-69ffd5987-wz9b5\" (UID: \"6d6f1657-b9dd-4fba-a216-ca660a4fa958\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.256118 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce-tls-secret\") pod \"logging-loki-gateway-69ffd5987-jj95c\" (UID: \"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce\") " pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.275320 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.276635 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.278615 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.280096 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.293243 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.351104 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.352212 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.356447 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.356503 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.368820 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.452436 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-20225b66-fa75-4986-8d04-e0e7e948c829\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20225b66-fa75-4986-8d04-e0e7e948c829\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.452577 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.452605 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.452631 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.452727 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.452756 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f6ad4db8-64f3-403c-9c92-9033a73ed12c-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.452777 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-65f24eab-f9a5-4d3d-823c-f892fd42bd3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65f24eab-f9a5-4d3d-823c-f892fd42bd3d\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.452813 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/f6ad4db8-64f3-403c-9c92-9033a73ed12c-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " 
pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.452857 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6ad4db8-64f3-403c-9c92-9033a73ed12c-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.452881 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-24678b1b-35e7-4280-aac3-ac65512d700e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-24678b1b-35e7-4280-aac3-ac65512d700e\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.452920 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ad4db8-64f3-403c-9c92-9033a73ed12c-config\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.452972 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-config\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.453002 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2lwg\" (UniqueName: \"kubernetes.io/projected/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-kube-api-access-m2lwg\") pod 
\"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.453025 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/f6ad4db8-64f3-403c-9c92-9033a73ed12c-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.453046 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt9n8\" (UniqueName: \"kubernetes.io/projected/f6ad4db8-64f3-403c-9c92-9033a73ed12c-kube-api-access-jt9n8\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.468843 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.469845 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.473194 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.473189 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.487423 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.487985 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.507342 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554491 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8571f1ff-bc62-45f4-a34d-d221e36df569-config\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554546 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f6ad4db8-64f3-403c-9c92-9033a73ed12c-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554576 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/8571f1ff-bc62-45f4-a34d-d221e36df569-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554605 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-65f24eab-f9a5-4d3d-823c-f892fd42bd3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65f24eab-f9a5-4d3d-823c-f892fd42bd3d\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 
13:17:05.554636 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/f6ad4db8-64f3-403c-9c92-9033a73ed12c-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554659 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-24678b1b-35e7-4280-aac3-ac65512d700e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-24678b1b-35e7-4280-aac3-ac65512d700e\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554691 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/8571f1ff-bc62-45f4-a34d-d221e36df569-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554721 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-config\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554758 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " 
pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554777 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554820 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-26bc9891-b714-4d6d-ab19-80e0b1e49249\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26bc9891-b714-4d6d-ab19-80e0b1e49249\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554846 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/8571f1ff-bc62-45f4-a34d-d221e36df569-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554872 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554908 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6ad4db8-64f3-403c-9c92-9033a73ed12c-logging-loki-ca-bundle\") pod 
\"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554933 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8571f1ff-bc62-45f4-a34d-d221e36df569-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554957 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ad4db8-64f3-403c-9c92-9033a73ed12c-config\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.554993 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/f6ad4db8-64f3-403c-9c92-9033a73ed12c-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.555016 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt9n8\" (UniqueName: \"kubernetes.io/projected/f6ad4db8-64f3-403c-9c92-9033a73ed12c-kube-api-access-jt9n8\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.555042 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2lwg\" (UniqueName: 
\"kubernetes.io/projected/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-kube-api-access-m2lwg\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.555074 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-20225b66-fa75-4986-8d04-e0e7e948c829\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20225b66-fa75-4986-8d04-e0e7e948c829\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.555104 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.555136 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs4v2\" (UniqueName: \"kubernetes.io/projected/8571f1ff-bc62-45f4-a34d-d221e36df569-kube-api-access-gs4v2\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.558976 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ad4db8-64f3-403c-9c92-9033a73ed12c-config\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.559747 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.559784 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-65f24eab-f9a5-4d3d-823c-f892fd42bd3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65f24eab-f9a5-4d3d-823c-f892fd42bd3d\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dd2f2af043ab14434c0dbb06f32fc3135b9b8f6ecc21e7d89241a07d5acf07e9/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.560676 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/f6ad4db8-64f3-403c-9c92-9033a73ed12c-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.562041 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.562080 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-24678b1b-35e7-4280-aac3-ac65512d700e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-24678b1b-35e7-4280-aac3-ac65512d700e\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f78404e5f42c8e9de2ef6020bb144082a59a851393f1d437da79f500f9c9ac3c/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.562170 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.562211 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-20225b66-fa75-4986-8d04-e0e7e948c829\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20225b66-fa75-4986-8d04-e0e7e948c829\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/91274ca40dfbbab929074e17cd171c088ba18064fffaf43878bd9185f3442c5a/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.562449 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.563077 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: 
\"kubernetes.io/secret/f6ad4db8-64f3-403c-9c92-9033a73ed12c-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.563262 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6ad4db8-64f3-403c-9c92-9033a73ed12c-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.563308 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.564253 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.565572 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-config\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.567977 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: 
\"kubernetes.io/secret/f6ad4db8-64f3-403c-9c92-9033a73ed12c-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.578519 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt9n8\" (UniqueName: \"kubernetes.io/projected/f6ad4db8-64f3-403c-9c92-9033a73ed12c-kube-api-access-jt9n8\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.580041 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2lwg\" (UniqueName: \"kubernetes.io/projected/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-kube-api-access-m2lwg\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.582779 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/3cd3cd1d-9ead-4620-a346-f83e9e5190ba-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.598113 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-24678b1b-35e7-4280-aac3-ac65512d700e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-24678b1b-35e7-4280-aac3-ac65512d700e\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.626991 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-65f24eab-f9a5-4d3d-823c-f892fd42bd3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-65f24eab-f9a5-4d3d-823c-f892fd42bd3d\") pod \"logging-loki-compactor-0\" (UID: \"3cd3cd1d-9ead-4620-a346-f83e9e5190ba\") " pod="openshift-logging/logging-loki-compactor-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.644476 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-20225b66-fa75-4986-8d04-e0e7e948c829\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20225b66-fa75-4986-8d04-e0e7e948c829\") pod \"logging-loki-ingester-0\" (UID: \"f6ad4db8-64f3-403c-9c92-9033a73ed12c\") " pod="openshift-logging/logging-loki-ingester-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.657429 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/8571f1ff-bc62-45f4-a34d-d221e36df569-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.657642 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/8571f1ff-bc62-45f4-a34d-d221e36df569-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.657718 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-26bc9891-b714-4d6d-ab19-80e0b1e49249\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26bc9891-b714-4d6d-ab19-80e0b1e49249\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " 
pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.657753 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/8571f1ff-bc62-45f4-a34d-d221e36df569-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.657797 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8571f1ff-bc62-45f4-a34d-d221e36df569-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.657858 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs4v2\" (UniqueName: \"kubernetes.io/projected/8571f1ff-bc62-45f4-a34d-d221e36df569-kube-api-access-gs4v2\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.657889 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8571f1ff-bc62-45f4-a34d-d221e36df569-config\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.659331 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8571f1ff-bc62-45f4-a34d-d221e36df569-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.660480 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8571f1ff-bc62-45f4-a34d-d221e36df569-config\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.662574 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.662429 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/8571f1ff-bc62-45f4-a34d-d221e36df569-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.662621 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-26bc9891-b714-4d6d-ab19-80e0b1e49249\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26bc9891-b714-4d6d-ab19-80e0b1e49249\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3cf9414cac9c89584fdd8af0da6b2a188b8d120b9987a63213add302297ba12c/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.664186 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/8571f1ff-bc62-45f4-a34d-d221e36df569-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.665538 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/8571f1ff-bc62-45f4-a34d-d221e36df569-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.681581 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs4v2\" (UniqueName: \"kubernetes.io/projected/8571f1ff-bc62-45f4-a34d-d221e36df569-kube-api-access-gs4v2\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.693630 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-26bc9891-b714-4d6d-ab19-80e0b1e49249\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-26bc9891-b714-4d6d-ab19-80e0b1e49249\") pod \"logging-loki-index-gateway-0\" (UID: \"8571f1ff-bc62-45f4-a34d-d221e36df569\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.705069 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.795026 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.892341 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0"
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.908833 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"]
Dec 11 13:17:05 crc kubenswrapper[4898]: W1211 13:17:05.915176 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cd3cd1d_9ead_4620_a346_f83e9e5190ba.slice/crio-504cfe6c8d613eb1809ac95cebca4fc9acd8e760a443122a043d2f3f5ac7cee7 WatchSource:0}: Error finding container 504cfe6c8d613eb1809ac95cebca4fc9acd8e760a443122a043d2f3f5ac7cee7: Status 404 returned error can't find the container with id 504cfe6c8d613eb1809ac95cebca4fc9acd8e760a443122a043d2f3f5ac7cee7
Dec 11 13:17:05 crc kubenswrapper[4898]: I1211 13:17:05.933486 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-69ffd5987-wz9b5"]
Dec 11 13:17:06 crc kubenswrapper[4898]: I1211 13:17:06.076224 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-69ffd5987-jj95c"]
Dec 11 13:17:06 crc kubenswrapper[4898]: I1211 13:17:06.225017 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"]
Dec 11 13:17:06 crc kubenswrapper[4898]: W1211 13:17:06.235807 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6ad4db8_64f3_403c_9c92_9033a73ed12c.slice/crio-8c086fb054f6db637cb39d07a5fd1c630cd41d8c7d6a34324ac6347928d52951 WatchSource:0}: Error finding container 8c086fb054f6db637cb39d07a5fd1c630cd41d8c7d6a34324ac6347928d52951: Status 404 returned error can't find the container with id 8c086fb054f6db637cb39d07a5fd1c630cd41d8c7d6a34324ac6347928d52951
Dec 11 13:17:06 crc kubenswrapper[4898]: I1211 13:17:06.239020 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"]
Dec 11 13:17:06 crc kubenswrapper[4898]: I1211 13:17:06.245720 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" event={"ID":"6d6f1657-b9dd-4fba-a216-ca660a4fa958","Type":"ContainerStarted","Data":"fdd85d357682d5e89969b3b9855190638a0cee285d7a7b1452eaeacde8a4fc94"}
Dec 11 13:17:06 crc kubenswrapper[4898]: I1211 13:17:06.248705 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" event={"ID":"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce","Type":"ContainerStarted","Data":"1c3feb5674f9889d12bd3cba59eaa7a9dae83f64f3982c2a489a749fe8240688"}
Dec 11 13:17:06 crc kubenswrapper[4898]: I1211 13:17:06.250656 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"3cd3cd1d-9ead-4620-a346-f83e9e5190ba","Type":"ContainerStarted","Data":"504cfe6c8d613eb1809ac95cebca4fc9acd8e760a443122a043d2f3f5ac7cee7"}
Dec 11 13:17:07 crc kubenswrapper[4898]: I1211 13:17:07.259357 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"8571f1ff-bc62-45f4-a34d-d221e36df569","Type":"ContainerStarted","Data":"11546a2c70271b5bcca3e959b8eb1f40b374562991da02340a26ccf03537f45e"}
Dec 11 13:17:07 crc kubenswrapper[4898]: I1211 13:17:07.260923 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"f6ad4db8-64f3-403c-9c92-9033a73ed12c","Type":"ContainerStarted","Data":"8c086fb054f6db637cb39d07a5fd1c630cd41d8c7d6a34324ac6347928d52951"}
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.283935 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"3cd3cd1d-9ead-4620-a346-f83e9e5190ba","Type":"ContainerStarted","Data":"19aa56f68076544497cd8123853ca556072296e97114d8175b485f38993117ce"}
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.284618 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0"
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.288647 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" event={"ID":"6d6f1657-b9dd-4fba-a216-ca660a4fa958","Type":"ContainerStarted","Data":"8d13cbef781b3f01b1e9ceb0643d470b237a8cbfd383b2dfd0535b4736aa7452"}
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.302426 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"8571f1ff-bc62-45f4-a34d-d221e36df569","Type":"ContainerStarted","Data":"d6a4e9edef4644f047cf3d445b179cf423e73bd26f5497a7a1b2ea8ea9796e5c"}
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.302553 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.305920 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" event={"ID":"4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0","Type":"ContainerStarted","Data":"7bc5e7828ed46b5a17605a190f6f87844fe59a7d323de0f657e3e3ba713ba70d"}
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.306023 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd"
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.308194 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" event={"ID":"841a3e5b-876d-43b2-b24a-d5c01876c30d","Type":"ContainerStarted","Data":"9041131c978df76877cbea0fb6f42edb8f5d5723a5925b57c7d2d7ac046b7b83"}
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.308598 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m"
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.310289 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" event={"ID":"98efd2cb-8cd8-49c2-a54b-5a04cf51dc71","Type":"ContainerStarted","Data":"e20a0114d0be2a354e1be5ed666e4d392fdd472584d5f4c6d1c6eb0db19e2ac5"}
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.310674 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq"
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.312781 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" event={"ID":"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce","Type":"ContainerStarted","Data":"767f1f46732e3d7700fa8b495250f2cc0935a0c8aa8255313727a7d19ed2144f"}
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.315841 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=2.742773316 podStartE2EDuration="5.315831969s" podCreationTimestamp="2025-12-11 13:17:04 +0000 UTC" firstStartedPulling="2025-12-11 13:17:05.922220102 +0000 UTC m=+783.494546539" lastFinishedPulling="2025-12-11 13:17:08.495278745 +0000 UTC m=+786.067605192" observedRunningTime="2025-12-11 13:17:09.31552219 +0000 UTC m=+786.887848697" watchObservedRunningTime="2025-12-11 13:17:09.315831969 +0000 UTC m=+786.888158406"
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.316450 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"f6ad4db8-64f3-403c-9c92-9033a73ed12c","Type":"ContainerStarted","Data":"92259fdae026cf8fef657b9e7f04162a243a69e54bcbf1da8ed394f166104beb"}
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.317762 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0"
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.343532 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" podStartSLOduration=1.890157604 podStartE2EDuration="5.343512593s" podCreationTimestamp="2025-12-11 13:17:04 +0000 UTC" firstStartedPulling="2025-12-11 13:17:05.046498385 +0000 UTC m=+782.618824822" lastFinishedPulling="2025-12-11 13:17:08.499853334 +0000 UTC m=+786.072179811" observedRunningTime="2025-12-11 13:17:09.341595024 +0000 UTC m=+786.913921471" watchObservedRunningTime="2025-12-11 13:17:09.343512593 +0000 UTC m=+786.915839050"
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.376194 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" podStartSLOduration=2.170502541 podStartE2EDuration="5.376172759s" podCreationTimestamp="2025-12-11 13:17:04 +0000 UTC" firstStartedPulling="2025-12-11 13:17:05.174803317 +0000 UTC m=+782.747129754" lastFinishedPulling="2025-12-11 13:17:08.380473535 +0000 UTC m=+785.952799972" observedRunningTime="2025-12-11 13:17:09.370024061 +0000 UTC m=+786.942350568" watchObservedRunningTime="2025-12-11 13:17:09.376172759 +0000 UTC m=+786.948499206"
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.406555 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.160617435 podStartE2EDuration="5.406529034s" podCreationTimestamp="2025-12-11 13:17:04 +0000 UTC" firstStartedPulling="2025-12-11 13:17:06.250089588 +0000 UTC m=+783.822416025" lastFinishedPulling="2025-12-11 13:17:08.496001147 +0000 UTC m=+786.068327624" observedRunningTime="2025-12-11 13:17:09.404814032 +0000 UTC m=+786.977140499" watchObservedRunningTime="2025-12-11 13:17:09.406529034 +0000 UTC m=+786.978855521"
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.454196 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" podStartSLOduration=1.969330717 podStartE2EDuration="5.454168986s" podCreationTimestamp="2025-12-11 13:17:04 +0000 UTC" firstStartedPulling="2025-12-11 13:17:04.976958565 +0000 UTC m=+782.549285002" lastFinishedPulling="2025-12-11 13:17:08.461796804 +0000 UTC m=+786.034123271" observedRunningTime="2025-12-11 13:17:09.444490431 +0000 UTC m=+787.016816878" watchObservedRunningTime="2025-12-11 13:17:09.454168986 +0000 UTC m=+787.026495463"
Dec 11 13:17:09 crc kubenswrapper[4898]: I1211 13:17:09.468074 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.215867899 podStartE2EDuration="5.468046069s" podCreationTimestamp="2025-12-11 13:17:04 +0000 UTC" firstStartedPulling="2025-12-11 13:17:06.244234469 +0000 UTC m=+783.816560906" lastFinishedPulling="2025-12-11 13:17:08.496412629 +0000 UTC m=+786.068739076" observedRunningTime="2025-12-11 13:17:09.462654375 +0000 UTC m=+787.034980892" watchObservedRunningTime="2025-12-11 13:17:09.468046069 +0000 UTC m=+787.040372516"
Dec 11 13:17:11 crc kubenswrapper[4898]: I1211 13:17:11.360821 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" event={"ID":"c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce","Type":"ContainerStarted","Data":"5cd3db7eb7665d0bc6aab274fc1f58516b18f4afc41e835fed5856efcb5e7423"}
Dec 11 13:17:11 crc kubenswrapper[4898]: I1211 13:17:11.362682 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c"
Dec 11 13:17:11 crc kubenswrapper[4898]: I1211 13:17:11.362881 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c"
Dec 11 13:17:11 crc kubenswrapper[4898]: I1211 13:17:11.368590 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" event={"ID":"6d6f1657-b9dd-4fba-a216-ca660a4fa958","Type":"ContainerStarted","Data":"e5b321cacad6cb0165709c4132a9f0ebc3839df228b46c374a587fb1799d269d"}
Dec 11 13:17:11 crc kubenswrapper[4898]: I1211 13:17:11.378760 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c"
Dec 11 13:17:11 crc kubenswrapper[4898]: I1211 13:17:11.384086 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c"
Dec 11 13:17:11 crc kubenswrapper[4898]: I1211 13:17:11.407864 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" podStartSLOduration=2.77140459 podStartE2EDuration="7.407829826s" podCreationTimestamp="2025-12-11 13:17:04 +0000 UTC" firstStartedPulling="2025-12-11 13:17:06.138291409 +0000 UTC m=+783.710617846" lastFinishedPulling="2025-12-11 13:17:10.774716645 +0000 UTC m=+788.347043082" observedRunningTime="2025-12-11 13:17:11.39614741 +0000 UTC m=+788.968473887" watchObservedRunningTime="2025-12-11 13:17:11.407829826 +0000 UTC m=+788.980156313"
Dec 11 13:17:11 crc kubenswrapper[4898]: I1211 13:17:11.507332 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" podStartSLOduration=2.7612479309999998 podStartE2EDuration="7.507310579s" podCreationTimestamp="2025-12-11 13:17:04 +0000 UTC" firstStartedPulling="2025-12-11 13:17:06.021667774 +0000 UTC m=+783.593994211" lastFinishedPulling="2025-12-11 13:17:10.767730402 +0000 UTC m=+788.340056859" observedRunningTime="2025-12-11 13:17:11.490877778 +0000 UTC m=+789.063204215" watchObservedRunningTime="2025-12-11 13:17:11.507310579 +0000 UTC m=+789.079637026"
Dec 11 13:17:12 crc kubenswrapper[4898]: I1211 13:17:12.378622 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5"
Dec 11 13:17:12 crc kubenswrapper[4898]: I1211 13:17:12.378939 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5"
Dec 11 13:17:12 crc kubenswrapper[4898]: I1211 13:17:12.389212 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5"
Dec 11 13:17:12 crc kubenswrapper[4898]: I1211 13:17:12.399160 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5"
Dec 11 13:17:24 crc kubenswrapper[4898]: I1211 13:17:24.466914 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m"
Dec 11 13:17:24 crc kubenswrapper[4898]: I1211 13:17:24.607587 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd"
Dec 11 13:17:24 crc kubenswrapper[4898]: I1211 13:17:24.721686 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq"
Dec 11 13:17:25 crc kubenswrapper[4898]: I1211 13:17:25.711842 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0"
Dec 11 13:17:25 crc kubenswrapper[4898]: I1211 13:17:25.805769 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0"
Dec 11 13:17:25 crc kubenswrapper[4898]: I1211 13:17:25.898517 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens
Dec 11 13:17:25 crc kubenswrapper[4898]: I1211 13:17:25.898579 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="f6ad4db8-64f3-403c-9c92-9033a73ed12c" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Dec 11 13:17:34 crc kubenswrapper[4898]: I1211 13:17:34.996023 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 11 13:17:34 crc kubenswrapper[4898]: I1211 13:17:34.996784 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 11 13:17:35 crc kubenswrapper[4898]: I1211 13:17:35.900681 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens
Dec 11 13:17:35 crc kubenswrapper[4898]: I1211 13:17:35.901044 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="f6ad4db8-64f3-403c-9c92-9033a73ed12c" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Dec 11 13:17:45 crc kubenswrapper[4898]: I1211 13:17:45.899050 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready
Dec 11 13:17:45 crc kubenswrapper[4898]: I1211 13:17:45.899777 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="f6ad4db8-64f3-403c-9c92-9033a73ed12c" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Dec 11 13:17:55 crc kubenswrapper[4898]: I1211 13:17:55.897378 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready
Dec 11 13:17:55 crc kubenswrapper[4898]: I1211 13:17:55.897963 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="f6ad4db8-64f3-403c-9c92-9033a73ed12c" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Dec 11 13:18:04 crc kubenswrapper[4898]: I1211 13:18:04.995960 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 11 13:18:04 crc kubenswrapper[4898]: I1211 13:18:04.996756 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 11 13:18:04 crc kubenswrapper[4898]: I1211 13:18:04.996814 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk"
Dec 11 13:18:04 crc kubenswrapper[4898]: I1211 13:18:04.997975 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d92a34a49f382a6447faa4993a93c91c6e3595696f153fe710d6ffe0ba35fec"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 11 13:18:04 crc kubenswrapper[4898]: I1211 13:18:04.998056 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://0d92a34a49f382a6447faa4993a93c91c6e3595696f153fe710d6ffe0ba35fec" gracePeriod=600
Dec 11 13:18:05 crc kubenswrapper[4898]: I1211 13:18:05.850423 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="0d92a34a49f382a6447faa4993a93c91c6e3595696f153fe710d6ffe0ba35fec" exitCode=0
Dec 11 13:18:05 crc kubenswrapper[4898]: I1211 13:18:05.850538 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"0d92a34a49f382a6447faa4993a93c91c6e3595696f153fe710d6ffe0ba35fec"}
Dec 11 13:18:05 crc kubenswrapper[4898]: I1211 13:18:05.851191 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"d75b5443399659dfc4c4753c4049d2fed5f950b8d251c55a9abe387518cd27d2"}
Dec 11 13:18:05 crc kubenswrapper[4898]: I1211 13:18:05.851229 4898 scope.go:117] "RemoveContainer" containerID="207d0448c3903ada5a8d43c8e4caa5a44f53738d1392a4585159b0ae517f8656"
Dec 11 13:18:05 crc kubenswrapper[4898]: I1211 13:18:05.900985 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0"
Dec 11 13:18:13 crc kubenswrapper[4898]: I1211 13:18:13.434317 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v8rz5"]
Dec 11 13:18:13 crc kubenswrapper[4898]: I1211 13:18:13.448274 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v8rz5"
Dec 11 13:18:13 crc kubenswrapper[4898]: I1211 13:18:13.461751 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v8rz5"]
Dec 11 13:18:13 crc kubenswrapper[4898]: I1211 13:18:13.495743 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9eb4c92-7716-4065-bc6c-9387b8f0299a-catalog-content\") pod \"certified-operators-v8rz5\" (UID: \"b9eb4c92-7716-4065-bc6c-9387b8f0299a\") " pod="openshift-marketplace/certified-operators-v8rz5"
Dec 11 13:18:13 crc kubenswrapper[4898]: I1211 13:18:13.495979 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9eb4c92-7716-4065-bc6c-9387b8f0299a-utilities\") pod \"certified-operators-v8rz5\" (UID: \"b9eb4c92-7716-4065-bc6c-9387b8f0299a\") " pod="openshift-marketplace/certified-operators-v8rz5"
Dec 11 13:18:13 crc kubenswrapper[4898]: I1211 13:18:13.496016 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zrcf\" (UniqueName: \"kubernetes.io/projected/b9eb4c92-7716-4065-bc6c-9387b8f0299a-kube-api-access-7zrcf\") pod \"certified-operators-v8rz5\" (UID: \"b9eb4c92-7716-4065-bc6c-9387b8f0299a\") " pod="openshift-marketplace/certified-operators-v8rz5"
Dec 11 13:18:13 crc kubenswrapper[4898]: I1211 13:18:13.596869 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9eb4c92-7716-4065-bc6c-9387b8f0299a-utilities\") pod \"certified-operators-v8rz5\" (UID: \"b9eb4c92-7716-4065-bc6c-9387b8f0299a\") " pod="openshift-marketplace/certified-operators-v8rz5"
Dec 11 13:18:13 crc kubenswrapper[4898]: I1211 13:18:13.596914 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zrcf\" (UniqueName: \"kubernetes.io/projected/b9eb4c92-7716-4065-bc6c-9387b8f0299a-kube-api-access-7zrcf\") pod \"certified-operators-v8rz5\" (UID: \"b9eb4c92-7716-4065-bc6c-9387b8f0299a\") " pod="openshift-marketplace/certified-operators-v8rz5"
Dec 11 13:18:13 crc kubenswrapper[4898]: I1211 13:18:13.596937 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9eb4c92-7716-4065-bc6c-9387b8f0299a-catalog-content\") pod \"certified-operators-v8rz5\" (UID: \"b9eb4c92-7716-4065-bc6c-9387b8f0299a\") " pod="openshift-marketplace/certified-operators-v8rz5"
Dec 11 13:18:13 crc kubenswrapper[4898]: I1211 13:18:13.597351 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9eb4c92-7716-4065-bc6c-9387b8f0299a-catalog-content\") pod \"certified-operators-v8rz5\" (UID: \"b9eb4c92-7716-4065-bc6c-9387b8f0299a\") " pod="openshift-marketplace/certified-operators-v8rz5"
Dec 11 13:18:13 crc kubenswrapper[4898]: I1211 13:18:13.597493 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9eb4c92-7716-4065-bc6c-9387b8f0299a-utilities\") pod \"certified-operators-v8rz5\" (UID: \"b9eb4c92-7716-4065-bc6c-9387b8f0299a\") " pod="openshift-marketplace/certified-operators-v8rz5"
Dec 11 13:18:13 crc kubenswrapper[4898]: I1211 13:18:13.642507 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zrcf\" (UniqueName: \"kubernetes.io/projected/b9eb4c92-7716-4065-bc6c-9387b8f0299a-kube-api-access-7zrcf\") pod \"certified-operators-v8rz5\" (UID: \"b9eb4c92-7716-4065-bc6c-9387b8f0299a\") " pod="openshift-marketplace/certified-operators-v8rz5"
Dec 11 13:18:13 crc kubenswrapper[4898]: I1211 13:18:13.811761 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v8rz5"
Dec 11 13:18:14 crc kubenswrapper[4898]: I1211 13:18:14.340879 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v8rz5"]
Dec 11 13:18:14 crc kubenswrapper[4898]: I1211 13:18:14.937779 4898 generic.go:334] "Generic (PLEG): container finished" podID="b9eb4c92-7716-4065-bc6c-9387b8f0299a" containerID="dcfbb2f095272380794c36ebcf27d9901ab559b9a92ade5791a0d1caf70da13f" exitCode=0
Dec 11 13:18:14 crc kubenswrapper[4898]: I1211 13:18:14.938124 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v8rz5" event={"ID":"b9eb4c92-7716-4065-bc6c-9387b8f0299a","Type":"ContainerDied","Data":"dcfbb2f095272380794c36ebcf27d9901ab559b9a92ade5791a0d1caf70da13f"}
Dec 11 13:18:14 crc kubenswrapper[4898]: I1211 13:18:14.938165 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v8rz5" event={"ID":"b9eb4c92-7716-4065-bc6c-9387b8f0299a","Type":"ContainerStarted","Data":"ebf86507ff7d60ed3b401567f2f77ab1b27a580f8053974a8f0cdd7b9a159b2e"}
Dec 11 13:18:15 crc kubenswrapper[4898]: I1211 13:18:15.947925 4898 generic.go:334] "Generic (PLEG): container finished" podID="b9eb4c92-7716-4065-bc6c-9387b8f0299a" containerID="6d3d638c223af66c51d8f3d4b863a0f0096c86658f0ad9164a3d0e6b28249961" exitCode=0
Dec 11 13:18:15 crc kubenswrapper[4898]: I1211 13:18:15.948004 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v8rz5" event={"ID":"b9eb4c92-7716-4065-bc6c-9387b8f0299a","Type":"ContainerDied","Data":"6d3d638c223af66c51d8f3d4b863a0f0096c86658f0ad9164a3d0e6b28249961"}
Dec 11 13:18:16 crc kubenswrapper[4898]: I1211 13:18:16.962217 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v8rz5" event={"ID":"b9eb4c92-7716-4065-bc6c-9387b8f0299a","Type":"ContainerStarted","Data":"2cacc277d9b0387b52824a0d6cb151530f96368ff80e90a6ca1542b7c1956121"}
Dec 11 13:18:16 crc kubenswrapper[4898]: I1211 13:18:16.995701 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v8rz5" podStartSLOduration=2.530222547 podStartE2EDuration="3.99567555s" podCreationTimestamp="2025-12-11 13:18:13 +0000 UTC" firstStartedPulling="2025-12-11 13:18:14.941385149 +0000 UTC m=+852.513711626" lastFinishedPulling="2025-12-11 13:18:16.406838182 +0000 UTC m=+853.979164629" observedRunningTime="2025-12-11 13:18:16.990306623 +0000 UTC m=+854.562633070" watchObservedRunningTime="2025-12-11 13:18:16.99567555 +0000 UTC m=+854.568002007"
Dec 11 13:18:23 crc kubenswrapper[4898]: I1211 13:18:23.812516 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v8rz5"
Dec 11 13:18:23 crc kubenswrapper[4898]: I1211 13:18:23.813541 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v8rz5"
Dec 11 13:18:23 crc kubenswrapper[4898]: I1211 13:18:23.864541 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v8rz5"
Dec 11 13:18:24 crc kubenswrapper[4898]: I1211 13:18:24.063723 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v8rz5"
Dec 11 13:18:24 crc kubenswrapper[4898]: I1211 13:18:24.101814 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v8rz5"]
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.218257 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-hgndg"]
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.219558 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-hgndg"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.221544 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.221722 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.221657 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.226141 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.226157 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-zdq8x"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.240787 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.245588 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-hgndg"]
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.315992 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47w86\" (UniqueName: \"kubernetes.io/projected/77030d75-67ee-48b4-bb7b-fcaba6219f53-kube-api-access-47w86\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.316043 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-entrypoint\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.316061 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-trusted-ca\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.316080 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-config-openshift-service-cacrt\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.316100 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/77030d75-67ee-48b4-bb7b-fcaba6219f53-datadir\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.316114 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-metrics\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.316144 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-config\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.316164 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-collector-syslog-receiver\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.316182 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/77030d75-67ee-48b4-bb7b-fcaba6219f53-sa-token\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.316216 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-collector-token\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg"
Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.316241 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77030d75-67ee-48b4-bb7b-fcaba6219f53-tmp\") pod
\"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.329201 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-hgndg"] Dec 11 13:18:25 crc kubenswrapper[4898]: E1211 13:18:25.329982 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-47w86 metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-hgndg" podUID="77030d75-67ee-48b4-bb7b-fcaba6219f53" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.417272 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77030d75-67ee-48b4-bb7b-fcaba6219f53-tmp\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.417345 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47w86\" (UniqueName: \"kubernetes.io/projected/77030d75-67ee-48b4-bb7b-fcaba6219f53-kube-api-access-47w86\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.417374 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-entrypoint\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.417390 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-trusted-ca\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.417411 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-config-openshift-service-cacrt\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.417434 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/77030d75-67ee-48b4-bb7b-fcaba6219f53-datadir\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.417465 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-metrics\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.417498 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-config\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.417520 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-collector-syslog-receiver\") pod \"collector-hgndg\" (UID: 
\"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.417539 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/77030d75-67ee-48b4-bb7b-fcaba6219f53-sa-token\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.417577 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-collector-token\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: E1211 13:18:25.417798 4898 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Dec 11 13:18:25 crc kubenswrapper[4898]: E1211 13:18:25.417844 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-metrics podName:77030d75-67ee-48b4-bb7b-fcaba6219f53 nodeName:}" failed. No retries permitted until 2025-12-11 13:18:25.917830479 +0000 UTC m=+863.490156916 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-metrics") pod "collector-hgndg" (UID: "77030d75-67ee-48b4-bb7b-fcaba6219f53") : secret "collector-metrics" not found Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.418278 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/77030d75-67ee-48b4-bb7b-fcaba6219f53-datadir\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.418848 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-config\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.418862 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-entrypoint\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.419480 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-trusted-ca\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.419960 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-config-openshift-service-cacrt\") pod \"collector-hgndg\" (UID: 
\"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.424068 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-collector-token\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.425255 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77030d75-67ee-48b4-bb7b-fcaba6219f53-tmp\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.434355 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-collector-syslog-receiver\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.438996 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/77030d75-67ee-48b4-bb7b-fcaba6219f53-sa-token\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.463237 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47w86\" (UniqueName: \"kubernetes.io/projected/77030d75-67ee-48b4-bb7b-fcaba6219f53-kube-api-access-47w86\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.923773 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-metrics\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:25 crc kubenswrapper[4898]: I1211 13:18:25.928840 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-metrics\") pod \"collector-hgndg\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " pod="openshift-logging/collector-hgndg" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.037970 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v8rz5" podUID="b9eb4c92-7716-4065-bc6c-9387b8f0299a" containerName="registry-server" containerID="cri-o://2cacc277d9b0387b52824a0d6cb151530f96368ff80e90a6ca1542b7c1956121" gracePeriod=2 Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.038059 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-hgndg" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.053622 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-hgndg" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.127021 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77030d75-67ee-48b4-bb7b-fcaba6219f53-tmp\") pod \"77030d75-67ee-48b4-bb7b-fcaba6219f53\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.127093 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-metrics\") pod \"77030d75-67ee-48b4-bb7b-fcaba6219f53\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.127246 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47w86\" (UniqueName: \"kubernetes.io/projected/77030d75-67ee-48b4-bb7b-fcaba6219f53-kube-api-access-47w86\") pod \"77030d75-67ee-48b4-bb7b-fcaba6219f53\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.127284 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-trusted-ca\") pod \"77030d75-67ee-48b4-bb7b-fcaba6219f53\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.127384 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-entrypoint\") pod \"77030d75-67ee-48b4-bb7b-fcaba6219f53\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.127416 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: 
\"kubernetes.io/host-path/77030d75-67ee-48b4-bb7b-fcaba6219f53-datadir\") pod \"77030d75-67ee-48b4-bb7b-fcaba6219f53\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.127444 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-config\") pod \"77030d75-67ee-48b4-bb7b-fcaba6219f53\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.127491 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-config-openshift-service-cacrt\") pod \"77030d75-67ee-48b4-bb7b-fcaba6219f53\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.127537 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-collector-token\") pod \"77030d75-67ee-48b4-bb7b-fcaba6219f53\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.127594 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/77030d75-67ee-48b4-bb7b-fcaba6219f53-sa-token\") pod \"77030d75-67ee-48b4-bb7b-fcaba6219f53\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.127633 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-collector-syslog-receiver\") pod \"77030d75-67ee-48b4-bb7b-fcaba6219f53\" (UID: \"77030d75-67ee-48b4-bb7b-fcaba6219f53\") " Dec 11 13:18:26 crc 
kubenswrapper[4898]: I1211 13:18:26.127740 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77030d75-67ee-48b4-bb7b-fcaba6219f53-datadir" (OuterVolumeSpecName: "datadir") pod "77030d75-67ee-48b4-bb7b-fcaba6219f53" (UID: "77030d75-67ee-48b4-bb7b-fcaba6219f53"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.128026 4898 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/77030d75-67ee-48b4-bb7b-fcaba6219f53-datadir\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.128424 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-config" (OuterVolumeSpecName: "config") pod "77030d75-67ee-48b4-bb7b-fcaba6219f53" (UID: "77030d75-67ee-48b4-bb7b-fcaba6219f53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.131763 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "77030d75-67ee-48b4-bb7b-fcaba6219f53" (UID: "77030d75-67ee-48b4-bb7b-fcaba6219f53"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.131907 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "77030d75-67ee-48b4-bb7b-fcaba6219f53" (UID: "77030d75-67ee-48b4-bb7b-fcaba6219f53"). InnerVolumeSpecName "entrypoint". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.131921 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "77030d75-67ee-48b4-bb7b-fcaba6219f53" (UID: "77030d75-67ee-48b4-bb7b-fcaba6219f53"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.132185 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77030d75-67ee-48b4-bb7b-fcaba6219f53-tmp" (OuterVolumeSpecName: "tmp") pod "77030d75-67ee-48b4-bb7b-fcaba6219f53" (UID: "77030d75-67ee-48b4-bb7b-fcaba6219f53"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.134197 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "77030d75-67ee-48b4-bb7b-fcaba6219f53" (UID: "77030d75-67ee-48b4-bb7b-fcaba6219f53"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.136954 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77030d75-67ee-48b4-bb7b-fcaba6219f53-sa-token" (OuterVolumeSpecName: "sa-token") pod "77030d75-67ee-48b4-bb7b-fcaba6219f53" (UID: "77030d75-67ee-48b4-bb7b-fcaba6219f53"). InnerVolumeSpecName "sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.137259 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-collector-token" (OuterVolumeSpecName: "collector-token") pod "77030d75-67ee-48b4-bb7b-fcaba6219f53" (UID: "77030d75-67ee-48b4-bb7b-fcaba6219f53"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.138950 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77030d75-67ee-48b4-bb7b-fcaba6219f53-kube-api-access-47w86" (OuterVolumeSpecName: "kube-api-access-47w86") pod "77030d75-67ee-48b4-bb7b-fcaba6219f53" (UID: "77030d75-67ee-48b4-bb7b-fcaba6219f53"). InnerVolumeSpecName "kube-api-access-47w86". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.141671 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-metrics" (OuterVolumeSpecName: "metrics") pod "77030d75-67ee-48b4-bb7b-fcaba6219f53" (UID: "77030d75-67ee-48b4-bb7b-fcaba6219f53"). InnerVolumeSpecName "metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.231692 4898 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-entrypoint\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.232116 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.232139 4898 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.232159 4898 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-collector-token\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.232178 4898 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/77030d75-67ee-48b4-bb7b-fcaba6219f53-sa-token\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.232195 4898 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.232213 4898 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/77030d75-67ee-48b4-bb7b-fcaba6219f53-tmp\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.232229 4898 
reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/77030d75-67ee-48b4-bb7b-fcaba6219f53-metrics\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.232245 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47w86\" (UniqueName: \"kubernetes.io/projected/77030d75-67ee-48b4-bb7b-fcaba6219f53-kube-api-access-47w86\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.232262 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77030d75-67ee-48b4-bb7b-fcaba6219f53-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.540669 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v8rz5" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.640075 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zrcf\" (UniqueName: \"kubernetes.io/projected/b9eb4c92-7716-4065-bc6c-9387b8f0299a-kube-api-access-7zrcf\") pod \"b9eb4c92-7716-4065-bc6c-9387b8f0299a\" (UID: \"b9eb4c92-7716-4065-bc6c-9387b8f0299a\") " Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.640336 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9eb4c92-7716-4065-bc6c-9387b8f0299a-catalog-content\") pod \"b9eb4c92-7716-4065-bc6c-9387b8f0299a\" (UID: \"b9eb4c92-7716-4065-bc6c-9387b8f0299a\") " Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.640380 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9eb4c92-7716-4065-bc6c-9387b8f0299a-utilities\") pod \"b9eb4c92-7716-4065-bc6c-9387b8f0299a\" (UID: \"b9eb4c92-7716-4065-bc6c-9387b8f0299a\") " Dec 11 
13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.642049 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9eb4c92-7716-4065-bc6c-9387b8f0299a-utilities" (OuterVolumeSpecName: "utilities") pod "b9eb4c92-7716-4065-bc6c-9387b8f0299a" (UID: "b9eb4c92-7716-4065-bc6c-9387b8f0299a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.647677 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9eb4c92-7716-4065-bc6c-9387b8f0299a-kube-api-access-7zrcf" (OuterVolumeSpecName: "kube-api-access-7zrcf") pod "b9eb4c92-7716-4065-bc6c-9387b8f0299a" (UID: "b9eb4c92-7716-4065-bc6c-9387b8f0299a"). InnerVolumeSpecName "kube-api-access-7zrcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.742654 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9eb4c92-7716-4065-bc6c-9387b8f0299a-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:26 crc kubenswrapper[4898]: I1211 13:18:26.742698 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zrcf\" (UniqueName: \"kubernetes.io/projected/b9eb4c92-7716-4065-bc6c-9387b8f0299a-kube-api-access-7zrcf\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.052631 4898 generic.go:334] "Generic (PLEG): container finished" podID="b9eb4c92-7716-4065-bc6c-9387b8f0299a" containerID="2cacc277d9b0387b52824a0d6cb151530f96368ff80e90a6ca1542b7c1956121" exitCode=0 Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.052734 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v8rz5" event={"ID":"b9eb4c92-7716-4065-bc6c-9387b8f0299a","Type":"ContainerDied","Data":"2cacc277d9b0387b52824a0d6cb151530f96368ff80e90a6ca1542b7c1956121"} 
Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.053105 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-hgndg" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.052810 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v8rz5" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.053146 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v8rz5" event={"ID":"b9eb4c92-7716-4065-bc6c-9387b8f0299a","Type":"ContainerDied","Data":"ebf86507ff7d60ed3b401567f2f77ab1b27a580f8053974a8f0cdd7b9a159b2e"} Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.053267 4898 scope.go:117] "RemoveContainer" containerID="2cacc277d9b0387b52824a0d6cb151530f96368ff80e90a6ca1542b7c1956121" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.085999 4898 scope.go:117] "RemoveContainer" containerID="6d3d638c223af66c51d8f3d4b863a0f0096c86658f0ad9164a3d0e6b28249961" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.158022 4898 scope.go:117] "RemoveContainer" containerID="dcfbb2f095272380794c36ebcf27d9901ab559b9a92ade5791a0d1caf70da13f" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.160435 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-hgndg"] Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.180520 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-hgndg"] Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.182883 4898 scope.go:117] "RemoveContainer" containerID="2cacc277d9b0387b52824a0d6cb151530f96368ff80e90a6ca1542b7c1956121" Dec 11 13:18:27 crc kubenswrapper[4898]: E1211 13:18:27.183517 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cacc277d9b0387b52824a0d6cb151530f96368ff80e90a6ca1542b7c1956121\": 
container with ID starting with 2cacc277d9b0387b52824a0d6cb151530f96368ff80e90a6ca1542b7c1956121 not found: ID does not exist" containerID="2cacc277d9b0387b52824a0d6cb151530f96368ff80e90a6ca1542b7c1956121" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.183680 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cacc277d9b0387b52824a0d6cb151530f96368ff80e90a6ca1542b7c1956121"} err="failed to get container status \"2cacc277d9b0387b52824a0d6cb151530f96368ff80e90a6ca1542b7c1956121\": rpc error: code = NotFound desc = could not find container \"2cacc277d9b0387b52824a0d6cb151530f96368ff80e90a6ca1542b7c1956121\": container with ID starting with 2cacc277d9b0387b52824a0d6cb151530f96368ff80e90a6ca1542b7c1956121 not found: ID does not exist" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.183735 4898 scope.go:117] "RemoveContainer" containerID="6d3d638c223af66c51d8f3d4b863a0f0096c86658f0ad9164a3d0e6b28249961" Dec 11 13:18:27 crc kubenswrapper[4898]: E1211 13:18:27.184189 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d3d638c223af66c51d8f3d4b863a0f0096c86658f0ad9164a3d0e6b28249961\": container with ID starting with 6d3d638c223af66c51d8f3d4b863a0f0096c86658f0ad9164a3d0e6b28249961 not found: ID does not exist" containerID="6d3d638c223af66c51d8f3d4b863a0f0096c86658f0ad9164a3d0e6b28249961" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.184322 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d3d638c223af66c51d8f3d4b863a0f0096c86658f0ad9164a3d0e6b28249961"} err="failed to get container status \"6d3d638c223af66c51d8f3d4b863a0f0096c86658f0ad9164a3d0e6b28249961\": rpc error: code = NotFound desc = could not find container \"6d3d638c223af66c51d8f3d4b863a0f0096c86658f0ad9164a3d0e6b28249961\": container with ID starting with 
6d3d638c223af66c51d8f3d4b863a0f0096c86658f0ad9164a3d0e6b28249961 not found: ID does not exist" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.184431 4898 scope.go:117] "RemoveContainer" containerID="dcfbb2f095272380794c36ebcf27d9901ab559b9a92ade5791a0d1caf70da13f" Dec 11 13:18:27 crc kubenswrapper[4898]: E1211 13:18:27.184931 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcfbb2f095272380794c36ebcf27d9901ab559b9a92ade5791a0d1caf70da13f\": container with ID starting with dcfbb2f095272380794c36ebcf27d9901ab559b9a92ade5791a0d1caf70da13f not found: ID does not exist" containerID="dcfbb2f095272380794c36ebcf27d9901ab559b9a92ade5791a0d1caf70da13f" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.185048 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcfbb2f095272380794c36ebcf27d9901ab559b9a92ade5791a0d1caf70da13f"} err="failed to get container status \"dcfbb2f095272380794c36ebcf27d9901ab559b9a92ade5791a0d1caf70da13f\": rpc error: code = NotFound desc = could not find container \"dcfbb2f095272380794c36ebcf27d9901ab559b9a92ade5791a0d1caf70da13f\": container with ID starting with dcfbb2f095272380794c36ebcf27d9901ab559b9a92ade5791a0d1caf70da13f not found: ID does not exist" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.192765 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-ql6sd"] Dec 11 13:18:27 crc kubenswrapper[4898]: E1211 13:18:27.193122 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eb4c92-7716-4065-bc6c-9387b8f0299a" containerName="extract-utilities" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.193143 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eb4c92-7716-4065-bc6c-9387b8f0299a" containerName="extract-utilities" Dec 11 13:18:27 crc kubenswrapper[4898]: E1211 13:18:27.193160 4898 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b9eb4c92-7716-4065-bc6c-9387b8f0299a" containerName="registry-server" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.193169 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eb4c92-7716-4065-bc6c-9387b8f0299a" containerName="registry-server" Dec 11 13:18:27 crc kubenswrapper[4898]: E1211 13:18:27.193199 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eb4c92-7716-4065-bc6c-9387b8f0299a" containerName="extract-content" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.193207 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eb4c92-7716-4065-bc6c-9387b8f0299a" containerName="extract-content" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.193370 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9eb4c92-7716-4065-bc6c-9387b8f0299a" containerName="registry-server" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.194166 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.196394 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-zdq8x" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.196755 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.198603 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.198714 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.199192 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.201136 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-ql6sd"] Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.204012 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.261704 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c696c635-6d7a-40d8-aef4-ee5781067e7f-trusted-ca\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.261792 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/c696c635-6d7a-40d8-aef4-ee5781067e7f-datadir\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " 
pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.261843 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws2zm\" (UniqueName: \"kubernetes.io/projected/c696c635-6d7a-40d8-aef4-ee5781067e7f-kube-api-access-ws2zm\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.262673 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/c696c635-6d7a-40d8-aef4-ee5781067e7f-collector-syslog-receiver\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.262756 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/c696c635-6d7a-40d8-aef4-ee5781067e7f-metrics\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.263089 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/c696c635-6d7a-40d8-aef4-ee5781067e7f-config-openshift-service-cacrt\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.263149 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c696c635-6d7a-40d8-aef4-ee5781067e7f-tmp\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " 
pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.263192 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/c696c635-6d7a-40d8-aef4-ee5781067e7f-sa-token\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.263220 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/c696c635-6d7a-40d8-aef4-ee5781067e7f-entrypoint\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.263283 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c696c635-6d7a-40d8-aef4-ee5781067e7f-config\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.263333 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/c696c635-6d7a-40d8-aef4-ee5781067e7f-collector-token\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.364620 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/c696c635-6d7a-40d8-aef4-ee5781067e7f-config-openshift-service-cacrt\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc 
kubenswrapper[4898]: I1211 13:18:27.364666 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c696c635-6d7a-40d8-aef4-ee5781067e7f-tmp\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.364687 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/c696c635-6d7a-40d8-aef4-ee5781067e7f-sa-token\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.364704 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/c696c635-6d7a-40d8-aef4-ee5781067e7f-entrypoint\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.364732 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c696c635-6d7a-40d8-aef4-ee5781067e7f-config\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.364755 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/c696c635-6d7a-40d8-aef4-ee5781067e7f-collector-token\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.364797 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/c696c635-6d7a-40d8-aef4-ee5781067e7f-trusted-ca\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.364814 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/c696c635-6d7a-40d8-aef4-ee5781067e7f-datadir\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.364843 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws2zm\" (UniqueName: \"kubernetes.io/projected/c696c635-6d7a-40d8-aef4-ee5781067e7f-kube-api-access-ws2zm\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.364870 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/c696c635-6d7a-40d8-aef4-ee5781067e7f-collector-syslog-receiver\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.364891 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/c696c635-6d7a-40d8-aef4-ee5781067e7f-metrics\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.366621 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/c696c635-6d7a-40d8-aef4-ee5781067e7f-datadir\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " 
pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.367957 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/c696c635-6d7a-40d8-aef4-ee5781067e7f-config-openshift-service-cacrt\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.368579 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c696c635-6d7a-40d8-aef4-ee5781067e7f-config\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.368748 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/c696c635-6d7a-40d8-aef4-ee5781067e7f-entrypoint\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.369468 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c696c635-6d7a-40d8-aef4-ee5781067e7f-trusted-ca\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.371896 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/c696c635-6d7a-40d8-aef4-ee5781067e7f-metrics\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.372218 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/c696c635-6d7a-40d8-aef4-ee5781067e7f-tmp\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.373559 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/c696c635-6d7a-40d8-aef4-ee5781067e7f-collector-token\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.385272 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/c696c635-6d7a-40d8-aef4-ee5781067e7f-collector-syslog-receiver\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.386880 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9eb4c92-7716-4065-bc6c-9387b8f0299a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9eb4c92-7716-4065-bc6c-9387b8f0299a" (UID: "b9eb4c92-7716-4065-bc6c-9387b8f0299a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.403966 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/c696c635-6d7a-40d8-aef4-ee5781067e7f-sa-token\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.405110 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws2zm\" (UniqueName: \"kubernetes.io/projected/c696c635-6d7a-40d8-aef4-ee5781067e7f-kube-api-access-ws2zm\") pod \"collector-ql6sd\" (UID: \"c696c635-6d7a-40d8-aef4-ee5781067e7f\") " pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.467124 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9eb4c92-7716-4065-bc6c-9387b8f0299a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.517909 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-drtbg"] Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.518616 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-ql6sd" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.520680 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.536623 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-drtbg"] Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.569100 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d966730-3840-43d8-9354-7711d949bbd4-utilities\") pod \"redhat-operators-drtbg\" (UID: \"3d966730-3840-43d8-9354-7711d949bbd4\") " pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.569263 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d966730-3840-43d8-9354-7711d949bbd4-catalog-content\") pod \"redhat-operators-drtbg\" (UID: \"3d966730-3840-43d8-9354-7711d949bbd4\") " pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.569356 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jtlm\" (UniqueName: \"kubernetes.io/projected/3d966730-3840-43d8-9354-7711d949bbd4-kube-api-access-9jtlm\") pod \"redhat-operators-drtbg\" (UID: \"3d966730-3840-43d8-9354-7711d949bbd4\") " pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.673331 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d966730-3840-43d8-9354-7711d949bbd4-utilities\") pod \"redhat-operators-drtbg\" (UID: \"3d966730-3840-43d8-9354-7711d949bbd4\") " pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.673971 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d966730-3840-43d8-9354-7711d949bbd4-catalog-content\") pod \"redhat-operators-drtbg\" (UID: \"3d966730-3840-43d8-9354-7711d949bbd4\") " pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.674177 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jtlm\" (UniqueName: \"kubernetes.io/projected/3d966730-3840-43d8-9354-7711d949bbd4-kube-api-access-9jtlm\") pod \"redhat-operators-drtbg\" (UID: \"3d966730-3840-43d8-9354-7711d949bbd4\") " pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.673895 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d966730-3840-43d8-9354-7711d949bbd4-utilities\") pod \"redhat-operators-drtbg\" (UID: \"3d966730-3840-43d8-9354-7711d949bbd4\") " pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.674477 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d966730-3840-43d8-9354-7711d949bbd4-catalog-content\") pod \"redhat-operators-drtbg\" (UID: \"3d966730-3840-43d8-9354-7711d949bbd4\") " pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.708595 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jtlm\" (UniqueName: \"kubernetes.io/projected/3d966730-3840-43d8-9354-7711d949bbd4-kube-api-access-9jtlm\") pod \"redhat-operators-drtbg\" (UID: \"3d966730-3840-43d8-9354-7711d949bbd4\") " pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.713151 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v8rz5"] Dec 11 13:18:27 crc 
kubenswrapper[4898]: I1211 13:18:27.718311 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v8rz5"] Dec 11 13:18:27 crc kubenswrapper[4898]: I1211 13:18:27.852385 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:28 crc kubenswrapper[4898]: I1211 13:18:28.010704 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-ql6sd"] Dec 11 13:18:28 crc kubenswrapper[4898]: I1211 13:18:28.064386 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-ql6sd" event={"ID":"c696c635-6d7a-40d8-aef4-ee5781067e7f","Type":"ContainerStarted","Data":"c64560228c614452ebd3d1ecae3e3b66e63ade5a0eac630bbfa2d63d5a03abb8"} Dec 11 13:18:28 crc kubenswrapper[4898]: I1211 13:18:28.281425 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-drtbg"] Dec 11 13:18:28 crc kubenswrapper[4898]: W1211 13:18:28.703202 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d966730_3840_43d8_9354_7711d949bbd4.slice/crio-5e76890f42a6d6a189d8f4f727fb3420f16f3cd5915976d6d363c8d70a5f63c0 WatchSource:0}: Error finding container 5e76890f42a6d6a189d8f4f727fb3420f16f3cd5915976d6d363c8d70a5f63c0: Status 404 returned error can't find the container with id 5e76890f42a6d6a189d8f4f727fb3420f16f3cd5915976d6d363c8d70a5f63c0 Dec 11 13:18:28 crc kubenswrapper[4898]: I1211 13:18:28.784717 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77030d75-67ee-48b4-bb7b-fcaba6219f53" path="/var/lib/kubelet/pods/77030d75-67ee-48b4-bb7b-fcaba6219f53/volumes" Dec 11 13:18:28 crc kubenswrapper[4898]: I1211 13:18:28.785294 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9eb4c92-7716-4065-bc6c-9387b8f0299a" 
path="/var/lib/kubelet/pods/b9eb4c92-7716-4065-bc6c-9387b8f0299a/volumes" Dec 11 13:18:29 crc kubenswrapper[4898]: I1211 13:18:29.075977 4898 generic.go:334] "Generic (PLEG): container finished" podID="3d966730-3840-43d8-9354-7711d949bbd4" containerID="36e8e001d5977cbe0372e38fd41ea81b759dab94535c3f7115ac4d0c13ee5067" exitCode=0 Dec 11 13:18:29 crc kubenswrapper[4898]: I1211 13:18:29.076016 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drtbg" event={"ID":"3d966730-3840-43d8-9354-7711d949bbd4","Type":"ContainerDied","Data":"36e8e001d5977cbe0372e38fd41ea81b759dab94535c3f7115ac4d0c13ee5067"} Dec 11 13:18:29 crc kubenswrapper[4898]: I1211 13:18:29.076041 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drtbg" event={"ID":"3d966730-3840-43d8-9354-7711d949bbd4","Type":"ContainerStarted","Data":"5e76890f42a6d6a189d8f4f727fb3420f16f3cd5915976d6d363c8d70a5f63c0"} Dec 11 13:18:30 crc kubenswrapper[4898]: I1211 13:18:30.082702 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drtbg" event={"ID":"3d966730-3840-43d8-9354-7711d949bbd4","Type":"ContainerStarted","Data":"7c61ba38617992176b257e17a3c170113095d14d5994e66b7856082a0bd06e47"} Dec 11 13:18:31 crc kubenswrapper[4898]: I1211 13:18:31.090861 4898 generic.go:334] "Generic (PLEG): container finished" podID="3d966730-3840-43d8-9354-7711d949bbd4" containerID="7c61ba38617992176b257e17a3c170113095d14d5994e66b7856082a0bd06e47" exitCode=0 Dec 11 13:18:31 crc kubenswrapper[4898]: I1211 13:18:31.091071 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drtbg" event={"ID":"3d966730-3840-43d8-9354-7711d949bbd4","Type":"ContainerDied","Data":"7c61ba38617992176b257e17a3c170113095d14d5994e66b7856082a0bd06e47"} Dec 11 13:18:34 crc kubenswrapper[4898]: I1211 13:18:34.724745 4898 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-4mssh"] Dec 11 13:18:34 crc kubenswrapper[4898]: I1211 13:18:34.727894 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:34 crc kubenswrapper[4898]: I1211 13:18:34.729447 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mssh"] Dec 11 13:18:34 crc kubenswrapper[4898]: I1211 13:18:34.797442 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71015a82-563b-4992-8246-174c0b0be2cb-utilities\") pod \"redhat-marketplace-4mssh\" (UID: \"71015a82-563b-4992-8246-174c0b0be2cb\") " pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:34 crc kubenswrapper[4898]: I1211 13:18:34.797602 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71015a82-563b-4992-8246-174c0b0be2cb-catalog-content\") pod \"redhat-marketplace-4mssh\" (UID: \"71015a82-563b-4992-8246-174c0b0be2cb\") " pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:34 crc kubenswrapper[4898]: I1211 13:18:34.797646 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dh7w\" (UniqueName: \"kubernetes.io/projected/71015a82-563b-4992-8246-174c0b0be2cb-kube-api-access-4dh7w\") pod \"redhat-marketplace-4mssh\" (UID: \"71015a82-563b-4992-8246-174c0b0be2cb\") " pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:34 crc kubenswrapper[4898]: I1211 13:18:34.899530 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71015a82-563b-4992-8246-174c0b0be2cb-utilities\") pod \"redhat-marketplace-4mssh\" (UID: \"71015a82-563b-4992-8246-174c0b0be2cb\") " 
pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:34 crc kubenswrapper[4898]: I1211 13:18:34.899683 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71015a82-563b-4992-8246-174c0b0be2cb-catalog-content\") pod \"redhat-marketplace-4mssh\" (UID: \"71015a82-563b-4992-8246-174c0b0be2cb\") " pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:34 crc kubenswrapper[4898]: I1211 13:18:34.899734 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dh7w\" (UniqueName: \"kubernetes.io/projected/71015a82-563b-4992-8246-174c0b0be2cb-kube-api-access-4dh7w\") pod \"redhat-marketplace-4mssh\" (UID: \"71015a82-563b-4992-8246-174c0b0be2cb\") " pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:34 crc kubenswrapper[4898]: I1211 13:18:34.900365 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71015a82-563b-4992-8246-174c0b0be2cb-utilities\") pod \"redhat-marketplace-4mssh\" (UID: \"71015a82-563b-4992-8246-174c0b0be2cb\") " pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:34 crc kubenswrapper[4898]: I1211 13:18:34.900625 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71015a82-563b-4992-8246-174c0b0be2cb-catalog-content\") pod \"redhat-marketplace-4mssh\" (UID: \"71015a82-563b-4992-8246-174c0b0be2cb\") " pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:34 crc kubenswrapper[4898]: I1211 13:18:34.922404 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dh7w\" (UniqueName: \"kubernetes.io/projected/71015a82-563b-4992-8246-174c0b0be2cb-kube-api-access-4dh7w\") pod \"redhat-marketplace-4mssh\" (UID: \"71015a82-563b-4992-8246-174c0b0be2cb\") " 
pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:35 crc kubenswrapper[4898]: I1211 13:18:35.125291 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-ql6sd" event={"ID":"c696c635-6d7a-40d8-aef4-ee5781067e7f","Type":"ContainerStarted","Data":"f2a30b4818eeca13b332bf997dcf2ed4a98d7a1ef30ae093eef80698e37c9913"} Dec 11 13:18:35 crc kubenswrapper[4898]: I1211 13:18:35.127367 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drtbg" event={"ID":"3d966730-3840-43d8-9354-7711d949bbd4","Type":"ContainerStarted","Data":"801c9e99c546d3417a9682e9ee6e65a2c6d02a3cf6889e65f9593d288c4a5f01"} Dec 11 13:18:35 crc kubenswrapper[4898]: I1211 13:18:35.150576 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-ql6sd" podStartSLOduration=1.437864094 podStartE2EDuration="8.150552022s" podCreationTimestamp="2025-12-11 13:18:27 +0000 UTC" firstStartedPulling="2025-12-11 13:18:28.027149826 +0000 UTC m=+865.599476263" lastFinishedPulling="2025-12-11 13:18:34.739837714 +0000 UTC m=+872.312164191" observedRunningTime="2025-12-11 13:18:35.146626635 +0000 UTC m=+872.718953082" watchObservedRunningTime="2025-12-11 13:18:35.150552022 +0000 UTC m=+872.722878469" Dec 11 13:18:35 crc kubenswrapper[4898]: I1211 13:18:35.216789 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:35 crc kubenswrapper[4898]: I1211 13:18:35.442552 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-drtbg" podStartSLOduration=2.822368188 podStartE2EDuration="8.442535841s" podCreationTimestamp="2025-12-11 13:18:27 +0000 UTC" firstStartedPulling="2025-12-11 13:18:29.077276254 +0000 UTC m=+866.649602691" lastFinishedPulling="2025-12-11 13:18:34.697443897 +0000 UTC m=+872.269770344" observedRunningTime="2025-12-11 13:18:35.175307188 +0000 UTC m=+872.747633615" watchObservedRunningTime="2025-12-11 13:18:35.442535841 +0000 UTC m=+873.014862268" Dec 11 13:18:35 crc kubenswrapper[4898]: I1211 13:18:35.444033 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mssh"] Dec 11 13:18:35 crc kubenswrapper[4898]: W1211 13:18:35.447558 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71015a82_563b_4992_8246_174c0b0be2cb.slice/crio-2f07f1aa57939d1e1dcf7c6396b6ac35434e6dc38cafa54b92cfef1b69a38ff3 WatchSource:0}: Error finding container 2f07f1aa57939d1e1dcf7c6396b6ac35434e6dc38cafa54b92cfef1b69a38ff3: Status 404 returned error can't find the container with id 2f07f1aa57939d1e1dcf7c6396b6ac35434e6dc38cafa54b92cfef1b69a38ff3 Dec 11 13:18:36 crc kubenswrapper[4898]: I1211 13:18:36.135956 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mssh" event={"ID":"71015a82-563b-4992-8246-174c0b0be2cb","Type":"ContainerStarted","Data":"b9ee0f72195c704e8474b97824668b71dd4dfe87b7bb05210ea6e3ce56d34541"} Dec 11 13:18:36 crc kubenswrapper[4898]: I1211 13:18:36.136407 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mssh" 
event={"ID":"71015a82-563b-4992-8246-174c0b0be2cb","Type":"ContainerStarted","Data":"2f07f1aa57939d1e1dcf7c6396b6ac35434e6dc38cafa54b92cfef1b69a38ff3"} Dec 11 13:18:37 crc kubenswrapper[4898]: I1211 13:18:37.146105 4898 generic.go:334] "Generic (PLEG): container finished" podID="71015a82-563b-4992-8246-174c0b0be2cb" containerID="b9ee0f72195c704e8474b97824668b71dd4dfe87b7bb05210ea6e3ce56d34541" exitCode=0 Dec 11 13:18:37 crc kubenswrapper[4898]: I1211 13:18:37.146223 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mssh" event={"ID":"71015a82-563b-4992-8246-174c0b0be2cb","Type":"ContainerDied","Data":"b9ee0f72195c704e8474b97824668b71dd4dfe87b7bb05210ea6e3ce56d34541"} Dec 11 13:18:37 crc kubenswrapper[4898]: I1211 13:18:37.852643 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:37 crc kubenswrapper[4898]: I1211 13:18:37.852690 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:38 crc kubenswrapper[4898]: I1211 13:18:38.153112 4898 generic.go:334] "Generic (PLEG): container finished" podID="71015a82-563b-4992-8246-174c0b0be2cb" containerID="35106283b56b4eaabc842b0bf3b4c427d3f22da6ac837ee6245311cf4d069dde" exitCode=0 Dec 11 13:18:38 crc kubenswrapper[4898]: I1211 13:18:38.153156 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mssh" event={"ID":"71015a82-563b-4992-8246-174c0b0be2cb","Type":"ContainerDied","Data":"35106283b56b4eaabc842b0bf3b4c427d3f22da6ac837ee6245311cf4d069dde"} Dec 11 13:18:38 crc kubenswrapper[4898]: I1211 13:18:38.921189 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-drtbg" podUID="3d966730-3840-43d8-9354-7711d949bbd4" containerName="registry-server" probeResult="failure" output=< Dec 11 13:18:38 crc kubenswrapper[4898]: 
timeout: failed to connect service ":50051" within 1s Dec 11 13:18:38 crc kubenswrapper[4898]: > Dec 11 13:18:39 crc kubenswrapper[4898]: I1211 13:18:39.161175 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mssh" event={"ID":"71015a82-563b-4992-8246-174c0b0be2cb","Type":"ContainerStarted","Data":"fee8a518d8fb30e7d0d9ec6a600f009ceef5a865108a25da57b0954db51fb6ae"} Dec 11 13:18:39 crc kubenswrapper[4898]: I1211 13:18:39.188095 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4mssh" podStartSLOduration=3.559256546 podStartE2EDuration="5.188075265s" podCreationTimestamp="2025-12-11 13:18:34 +0000 UTC" firstStartedPulling="2025-12-11 13:18:37.148550188 +0000 UTC m=+874.720876625" lastFinishedPulling="2025-12-11 13:18:38.777368897 +0000 UTC m=+876.349695344" observedRunningTime="2025-12-11 13:18:39.185064423 +0000 UTC m=+876.757390870" watchObservedRunningTime="2025-12-11 13:18:39.188075265 +0000 UTC m=+876.760401712" Dec 11 13:18:45 crc kubenswrapper[4898]: I1211 13:18:45.218412 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:45 crc kubenswrapper[4898]: I1211 13:18:45.219118 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:45 crc kubenswrapper[4898]: I1211 13:18:45.295084 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:46 crc kubenswrapper[4898]: I1211 13:18:46.284323 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:46 crc kubenswrapper[4898]: I1211 13:18:46.365508 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mssh"] Dec 11 13:18:47 crc 
kubenswrapper[4898]: I1211 13:18:47.931442 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:48 crc kubenswrapper[4898]: I1211 13:18:48.001937 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:48 crc kubenswrapper[4898]: I1211 13:18:48.232801 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4mssh" podUID="71015a82-563b-4992-8246-174c0b0be2cb" containerName="registry-server" containerID="cri-o://fee8a518d8fb30e7d0d9ec6a600f009ceef5a865108a25da57b0954db51fb6ae" gracePeriod=2 Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.188020 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-drtbg"] Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.238870 4898 generic.go:334] "Generic (PLEG): container finished" podID="71015a82-563b-4992-8246-174c0b0be2cb" containerID="fee8a518d8fb30e7d0d9ec6a600f009ceef5a865108a25da57b0954db51fb6ae" exitCode=0 Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.239070 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-drtbg" podUID="3d966730-3840-43d8-9354-7711d949bbd4" containerName="registry-server" containerID="cri-o://801c9e99c546d3417a9682e9ee6e65a2c6d02a3cf6889e65f9593d288c4a5f01" gracePeriod=2 Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.239399 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mssh" event={"ID":"71015a82-563b-4992-8246-174c0b0be2cb","Type":"ContainerDied","Data":"fee8a518d8fb30e7d0d9ec6a600f009ceef5a865108a25da57b0954db51fb6ae"} Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.524332 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.588171 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71015a82-563b-4992-8246-174c0b0be2cb-catalog-content\") pod \"71015a82-563b-4992-8246-174c0b0be2cb\" (UID: \"71015a82-563b-4992-8246-174c0b0be2cb\") " Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.588549 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dh7w\" (UniqueName: \"kubernetes.io/projected/71015a82-563b-4992-8246-174c0b0be2cb-kube-api-access-4dh7w\") pod \"71015a82-563b-4992-8246-174c0b0be2cb\" (UID: \"71015a82-563b-4992-8246-174c0b0be2cb\") " Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.588613 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71015a82-563b-4992-8246-174c0b0be2cb-utilities\") pod \"71015a82-563b-4992-8246-174c0b0be2cb\" (UID: \"71015a82-563b-4992-8246-174c0b0be2cb\") " Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.589342 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71015a82-563b-4992-8246-174c0b0be2cb-utilities" (OuterVolumeSpecName: "utilities") pod "71015a82-563b-4992-8246-174c0b0be2cb" (UID: "71015a82-563b-4992-8246-174c0b0be2cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.593612 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71015a82-563b-4992-8246-174c0b0be2cb-kube-api-access-4dh7w" (OuterVolumeSpecName: "kube-api-access-4dh7w") pod "71015a82-563b-4992-8246-174c0b0be2cb" (UID: "71015a82-563b-4992-8246-174c0b0be2cb"). InnerVolumeSpecName "kube-api-access-4dh7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.611758 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71015a82-563b-4992-8246-174c0b0be2cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71015a82-563b-4992-8246-174c0b0be2cb" (UID: "71015a82-563b-4992-8246-174c0b0be2cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.623787 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.690413 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d966730-3840-43d8-9354-7711d949bbd4-utilities\") pod \"3d966730-3840-43d8-9354-7711d949bbd4\" (UID: \"3d966730-3840-43d8-9354-7711d949bbd4\") " Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.690600 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d966730-3840-43d8-9354-7711d949bbd4-catalog-content\") pod \"3d966730-3840-43d8-9354-7711d949bbd4\" (UID: \"3d966730-3840-43d8-9354-7711d949bbd4\") " Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.690660 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jtlm\" (UniqueName: \"kubernetes.io/projected/3d966730-3840-43d8-9354-7711d949bbd4-kube-api-access-9jtlm\") pod \"3d966730-3840-43d8-9354-7711d949bbd4\" (UID: \"3d966730-3840-43d8-9354-7711d949bbd4\") " Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.691049 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dh7w\" (UniqueName: 
\"kubernetes.io/projected/71015a82-563b-4992-8246-174c0b0be2cb-kube-api-access-4dh7w\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.691071 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71015a82-563b-4992-8246-174c0b0be2cb-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.691085 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71015a82-563b-4992-8246-174c0b0be2cb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.693570 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d966730-3840-43d8-9354-7711d949bbd4-utilities" (OuterVolumeSpecName: "utilities") pod "3d966730-3840-43d8-9354-7711d949bbd4" (UID: "3d966730-3840-43d8-9354-7711d949bbd4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.694646 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d966730-3840-43d8-9354-7711d949bbd4-kube-api-access-9jtlm" (OuterVolumeSpecName: "kube-api-access-9jtlm") pod "3d966730-3840-43d8-9354-7711d949bbd4" (UID: "3d966730-3840-43d8-9354-7711d949bbd4"). InnerVolumeSpecName "kube-api-access-9jtlm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.793430 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d966730-3840-43d8-9354-7711d949bbd4-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.793491 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jtlm\" (UniqueName: \"kubernetes.io/projected/3d966730-3840-43d8-9354-7711d949bbd4-kube-api-access-9jtlm\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.818688 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d966730-3840-43d8-9354-7711d949bbd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d966730-3840-43d8-9354-7711d949bbd4" (UID: "3d966730-3840-43d8-9354-7711d949bbd4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:18:49 crc kubenswrapper[4898]: I1211 13:18:49.897366 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d966730-3840-43d8-9354-7711d949bbd4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.247121 4898 generic.go:334] "Generic (PLEG): container finished" podID="3d966730-3840-43d8-9354-7711d949bbd4" containerID="801c9e99c546d3417a9682e9ee6e65a2c6d02a3cf6889e65f9593d288c4a5f01" exitCode=0 Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.247166 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-drtbg" event={"ID":"3d966730-3840-43d8-9354-7711d949bbd4","Type":"ContainerDied","Data":"801c9e99c546d3417a9682e9ee6e65a2c6d02a3cf6889e65f9593d288c4a5f01"} Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.247220 4898 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-drtbg" event={"ID":"3d966730-3840-43d8-9354-7711d949bbd4","Type":"ContainerDied","Data":"5e76890f42a6d6a189d8f4f727fb3420f16f3cd5915976d6d363c8d70a5f63c0"} Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.247250 4898 scope.go:117] "RemoveContainer" containerID="801c9e99c546d3417a9682e9ee6e65a2c6d02a3cf6889e65f9593d288c4a5f01" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.248502 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-drtbg" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.249847 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mssh" event={"ID":"71015a82-563b-4992-8246-174c0b0be2cb","Type":"ContainerDied","Data":"2f07f1aa57939d1e1dcf7c6396b6ac35434e6dc38cafa54b92cfef1b69a38ff3"} Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.249933 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mssh" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.273349 4898 scope.go:117] "RemoveContainer" containerID="7c61ba38617992176b257e17a3c170113095d14d5994e66b7856082a0bd06e47" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.307794 4898 scope.go:117] "RemoveContainer" containerID="36e8e001d5977cbe0372e38fd41ea81b759dab94535c3f7115ac4d0c13ee5067" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.316499 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mssh"] Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.327737 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mssh"] Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.342474 4898 scope.go:117] "RemoveContainer" containerID="801c9e99c546d3417a9682e9ee6e65a2c6d02a3cf6889e65f9593d288c4a5f01" Dec 11 13:18:50 crc kubenswrapper[4898]: E1211 13:18:50.342933 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"801c9e99c546d3417a9682e9ee6e65a2c6d02a3cf6889e65f9593d288c4a5f01\": container with ID starting with 801c9e99c546d3417a9682e9ee6e65a2c6d02a3cf6889e65f9593d288c4a5f01 not found: ID does not exist" containerID="801c9e99c546d3417a9682e9ee6e65a2c6d02a3cf6889e65f9593d288c4a5f01" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.342984 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"801c9e99c546d3417a9682e9ee6e65a2c6d02a3cf6889e65f9593d288c4a5f01"} err="failed to get container status \"801c9e99c546d3417a9682e9ee6e65a2c6d02a3cf6889e65f9593d288c4a5f01\": rpc error: code = NotFound desc = could not find container \"801c9e99c546d3417a9682e9ee6e65a2c6d02a3cf6889e65f9593d288c4a5f01\": container with ID starting with 801c9e99c546d3417a9682e9ee6e65a2c6d02a3cf6889e65f9593d288c4a5f01 not found: 
ID does not exist" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.343011 4898 scope.go:117] "RemoveContainer" containerID="7c61ba38617992176b257e17a3c170113095d14d5994e66b7856082a0bd06e47" Dec 11 13:18:50 crc kubenswrapper[4898]: E1211 13:18:50.343396 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c61ba38617992176b257e17a3c170113095d14d5994e66b7856082a0bd06e47\": container with ID starting with 7c61ba38617992176b257e17a3c170113095d14d5994e66b7856082a0bd06e47 not found: ID does not exist" containerID="7c61ba38617992176b257e17a3c170113095d14d5994e66b7856082a0bd06e47" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.343442 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c61ba38617992176b257e17a3c170113095d14d5994e66b7856082a0bd06e47"} err="failed to get container status \"7c61ba38617992176b257e17a3c170113095d14d5994e66b7856082a0bd06e47\": rpc error: code = NotFound desc = could not find container \"7c61ba38617992176b257e17a3c170113095d14d5994e66b7856082a0bd06e47\": container with ID starting with 7c61ba38617992176b257e17a3c170113095d14d5994e66b7856082a0bd06e47 not found: ID does not exist" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.343503 4898 scope.go:117] "RemoveContainer" containerID="36e8e001d5977cbe0372e38fd41ea81b759dab94535c3f7115ac4d0c13ee5067" Dec 11 13:18:50 crc kubenswrapper[4898]: E1211 13:18:50.343757 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36e8e001d5977cbe0372e38fd41ea81b759dab94535c3f7115ac4d0c13ee5067\": container with ID starting with 36e8e001d5977cbe0372e38fd41ea81b759dab94535c3f7115ac4d0c13ee5067 not found: ID does not exist" containerID="36e8e001d5977cbe0372e38fd41ea81b759dab94535c3f7115ac4d0c13ee5067" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.343784 4898 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36e8e001d5977cbe0372e38fd41ea81b759dab94535c3f7115ac4d0c13ee5067"} err="failed to get container status \"36e8e001d5977cbe0372e38fd41ea81b759dab94535c3f7115ac4d0c13ee5067\": rpc error: code = NotFound desc = could not find container \"36e8e001d5977cbe0372e38fd41ea81b759dab94535c3f7115ac4d0c13ee5067\": container with ID starting with 36e8e001d5977cbe0372e38fd41ea81b759dab94535c3f7115ac4d0c13ee5067 not found: ID does not exist" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.343800 4898 scope.go:117] "RemoveContainer" containerID="fee8a518d8fb30e7d0d9ec6a600f009ceef5a865108a25da57b0954db51fb6ae" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.388072 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-drtbg"] Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.388789 4898 scope.go:117] "RemoveContainer" containerID="35106283b56b4eaabc842b0bf3b4c427d3f22da6ac837ee6245311cf4d069dde" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.392919 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-drtbg"] Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.402239 4898 scope.go:117] "RemoveContainer" containerID="b9ee0f72195c704e8474b97824668b71dd4dfe87b7bb05210ea6e3ce56d34541" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.788140 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d966730-3840-43d8-9354-7711d949bbd4" path="/var/lib/kubelet/pods/3d966730-3840-43d8-9354-7711d949bbd4/volumes" Dec 11 13:18:50 crc kubenswrapper[4898]: I1211 13:18:50.790082 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71015a82-563b-4992-8246-174c0b0be2cb" path="/var/lib/kubelet/pods/71015a82-563b-4992-8246-174c0b0be2cb/volumes" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.370568 4898 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c"] Dec 11 13:19:05 crc kubenswrapper[4898]: E1211 13:19:05.372971 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71015a82-563b-4992-8246-174c0b0be2cb" containerName="registry-server" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.373151 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="71015a82-563b-4992-8246-174c0b0be2cb" containerName="registry-server" Dec 11 13:19:05 crc kubenswrapper[4898]: E1211 13:19:05.373275 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d966730-3840-43d8-9354-7711d949bbd4" containerName="extract-utilities" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.373391 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d966730-3840-43d8-9354-7711d949bbd4" containerName="extract-utilities" Dec 11 13:19:05 crc kubenswrapper[4898]: E1211 13:19:05.373529 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71015a82-563b-4992-8246-174c0b0be2cb" containerName="extract-content" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.373632 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="71015a82-563b-4992-8246-174c0b0be2cb" containerName="extract-content" Dec 11 13:19:05 crc kubenswrapper[4898]: E1211 13:19:05.373751 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d966730-3840-43d8-9354-7711d949bbd4" containerName="registry-server" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.373850 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d966730-3840-43d8-9354-7711d949bbd4" containerName="registry-server" Dec 11 13:19:05 crc kubenswrapper[4898]: E1211 13:19:05.373967 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d966730-3840-43d8-9354-7711d949bbd4" containerName="extract-content" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.374073 4898 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3d966730-3840-43d8-9354-7711d949bbd4" containerName="extract-content" Dec 11 13:19:05 crc kubenswrapper[4898]: E1211 13:19:05.374181 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71015a82-563b-4992-8246-174c0b0be2cb" containerName="extract-utilities" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.374294 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="71015a82-563b-4992-8246-174c0b0be2cb" containerName="extract-utilities" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.374651 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="71015a82-563b-4992-8246-174c0b0be2cb" containerName="registry-server" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.374804 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d966730-3840-43d8-9354-7711d949bbd4" containerName="registry-server" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.376240 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.378647 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.386752 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c"] Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.445782 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c\" (UID: \"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.445858 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c\" (UID: \"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.445979 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8mpg\" (UniqueName: \"kubernetes.io/projected/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-kube-api-access-p8mpg\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c\" (UID: \"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" Dec 11 13:19:05 crc kubenswrapper[4898]: 
I1211 13:19:05.547171 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8mpg\" (UniqueName: \"kubernetes.io/projected/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-kube-api-access-p8mpg\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c\" (UID: \"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.547313 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c\" (UID: \"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.547344 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c\" (UID: \"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.547878 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c\" (UID: \"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.547904 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c\" (UID: \"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.578673 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8mpg\" (UniqueName: \"kubernetes.io/projected/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-kube-api-access-p8mpg\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c\" (UID: \"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" Dec 11 13:19:05 crc kubenswrapper[4898]: I1211 13:19:05.693266 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" Dec 11 13:19:06 crc kubenswrapper[4898]: W1211 13:19:06.258994 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9d7d79c_b405_4c5b_b4b9_860f7d4dea0e.slice/crio-37b2b096b8f75698510693325bc0bade59c9f5c901faba26eb2216ca3ec8a7f4 WatchSource:0}: Error finding container 37b2b096b8f75698510693325bc0bade59c9f5c901faba26eb2216ca3ec8a7f4: Status 404 returned error can't find the container with id 37b2b096b8f75698510693325bc0bade59c9f5c901faba26eb2216ca3ec8a7f4 Dec 11 13:19:06 crc kubenswrapper[4898]: I1211 13:19:06.267984 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c"] Dec 11 13:19:06 crc kubenswrapper[4898]: I1211 13:19:06.383727 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" 
event={"ID":"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e","Type":"ContainerStarted","Data":"37b2b096b8f75698510693325bc0bade59c9f5c901faba26eb2216ca3ec8a7f4"} Dec 11 13:19:07 crc kubenswrapper[4898]: I1211 13:19:07.392946 4898 generic.go:334] "Generic (PLEG): container finished" podID="c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e" containerID="880ead7e63549ab2dd0eac6dcaf6f12669059a39c2dd14f3f2f1ba1c9f7655da" exitCode=0 Dec 11 13:19:07 crc kubenswrapper[4898]: I1211 13:19:07.393008 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" event={"ID":"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e","Type":"ContainerDied","Data":"880ead7e63549ab2dd0eac6dcaf6f12669059a39c2dd14f3f2f1ba1c9f7655da"} Dec 11 13:19:09 crc kubenswrapper[4898]: I1211 13:19:09.412527 4898 generic.go:334] "Generic (PLEG): container finished" podID="c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e" containerID="7c70b871b3bb7ea453acea051ff3eabb5de2d30636d8d0e783d6b52136a37f86" exitCode=0 Dec 11 13:19:09 crc kubenswrapper[4898]: I1211 13:19:09.412621 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" event={"ID":"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e","Type":"ContainerDied","Data":"7c70b871b3bb7ea453acea051ff3eabb5de2d30636d8d0e783d6b52136a37f86"} Dec 11 13:19:10 crc kubenswrapper[4898]: I1211 13:19:10.425251 4898 generic.go:334] "Generic (PLEG): container finished" podID="c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e" containerID="2ad55bf2be2a296dba0ce8c75fc6201041da0b9aeb137b6e6bb17c9569836e05" exitCode=0 Dec 11 13:19:10 crc kubenswrapper[4898]: I1211 13:19:10.425312 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" event={"ID":"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e","Type":"ContainerDied","Data":"2ad55bf2be2a296dba0ce8c75fc6201041da0b9aeb137b6e6bb17c9569836e05"} 
Dec 11 13:19:11 crc kubenswrapper[4898]: I1211 13:19:11.761871 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" Dec 11 13:19:11 crc kubenswrapper[4898]: I1211 13:19:11.960191 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-util\") pod \"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e\" (UID: \"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e\") " Dec 11 13:19:11 crc kubenswrapper[4898]: I1211 13:19:11.960298 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-bundle\") pod \"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e\" (UID: \"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e\") " Dec 11 13:19:11 crc kubenswrapper[4898]: I1211 13:19:11.960354 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8mpg\" (UniqueName: \"kubernetes.io/projected/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-kube-api-access-p8mpg\") pod \"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e\" (UID: \"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e\") " Dec 11 13:19:11 crc kubenswrapper[4898]: I1211 13:19:11.962588 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-bundle" (OuterVolumeSpecName: "bundle") pod "c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e" (UID: "c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:19:11 crc kubenswrapper[4898]: I1211 13:19:11.981544 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-kube-api-access-p8mpg" (OuterVolumeSpecName: "kube-api-access-p8mpg") pod "c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e" (UID: "c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e"). InnerVolumeSpecName "kube-api-access-p8mpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:19:12 crc kubenswrapper[4898]: I1211 13:19:12.034536 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-util" (OuterVolumeSpecName: "util") pod "c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e" (UID: "c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:19:12 crc kubenswrapper[4898]: I1211 13:19:12.062439 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-util\") on node \"crc\" DevicePath \"\"" Dec 11 13:19:12 crc kubenswrapper[4898]: I1211 13:19:12.062502 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:19:12 crc kubenswrapper[4898]: I1211 13:19:12.062517 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8mpg\" (UniqueName: \"kubernetes.io/projected/c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e-kube-api-access-p8mpg\") on node \"crc\" DevicePath \"\"" Dec 11 13:19:12 crc kubenswrapper[4898]: I1211 13:19:12.443858 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" 
event={"ID":"c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e","Type":"ContainerDied","Data":"37b2b096b8f75698510693325bc0bade59c9f5c901faba26eb2216ca3ec8a7f4"} Dec 11 13:19:12 crc kubenswrapper[4898]: I1211 13:19:12.443905 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37b2b096b8f75698510693325bc0bade59c9f5c901faba26eb2216ca3ec8a7f4" Dec 11 13:19:12 crc kubenswrapper[4898]: I1211 13:19:12.443978 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c" Dec 11 13:19:17 crc kubenswrapper[4898]: I1211 13:19:17.384733 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-j52sl"] Dec 11 13:19:17 crc kubenswrapper[4898]: E1211 13:19:17.385267 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e" containerName="util" Dec 11 13:19:17 crc kubenswrapper[4898]: I1211 13:19:17.385278 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e" containerName="util" Dec 11 13:19:17 crc kubenswrapper[4898]: E1211 13:19:17.385286 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e" containerName="extract" Dec 11 13:19:17 crc kubenswrapper[4898]: I1211 13:19:17.385292 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e" containerName="extract" Dec 11 13:19:17 crc kubenswrapper[4898]: E1211 13:19:17.385299 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e" containerName="pull" Dec 11 13:19:17 crc kubenswrapper[4898]: I1211 13:19:17.385305 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e" containerName="pull" Dec 11 13:19:17 crc kubenswrapper[4898]: I1211 13:19:17.385439 4898 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e" containerName="extract" Dec 11 13:19:17 crc kubenswrapper[4898]: I1211 13:19:17.386055 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-j52sl" Dec 11 13:19:17 crc kubenswrapper[4898]: I1211 13:19:17.389077 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-8vnwn" Dec 11 13:19:17 crc kubenswrapper[4898]: I1211 13:19:17.389174 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 11 13:19:17 crc kubenswrapper[4898]: I1211 13:19:17.389260 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 11 13:19:17 crc kubenswrapper[4898]: I1211 13:19:17.397581 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-j52sl"] Dec 11 13:19:17 crc kubenswrapper[4898]: I1211 13:19:17.459438 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52mg7\" (UniqueName: \"kubernetes.io/projected/41793757-3faf-4702-b024-5e2ab032b432-kube-api-access-52mg7\") pod \"nmstate-operator-6769fb99d-j52sl\" (UID: \"41793757-3faf-4702-b024-5e2ab032b432\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-j52sl" Dec 11 13:19:17 crc kubenswrapper[4898]: I1211 13:19:17.560553 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52mg7\" (UniqueName: \"kubernetes.io/projected/41793757-3faf-4702-b024-5e2ab032b432-kube-api-access-52mg7\") pod \"nmstate-operator-6769fb99d-j52sl\" (UID: \"41793757-3faf-4702-b024-5e2ab032b432\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-j52sl" Dec 11 13:19:17 crc kubenswrapper[4898]: I1211 13:19:17.584861 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-52mg7\" (UniqueName: \"kubernetes.io/projected/41793757-3faf-4702-b024-5e2ab032b432-kube-api-access-52mg7\") pod \"nmstate-operator-6769fb99d-j52sl\" (UID: \"41793757-3faf-4702-b024-5e2ab032b432\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-j52sl" Dec 11 13:19:17 crc kubenswrapper[4898]: I1211 13:19:17.756188 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-j52sl" Dec 11 13:19:18 crc kubenswrapper[4898]: W1211 13:19:18.192653 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41793757_3faf_4702_b024_5e2ab032b432.slice/crio-a8427abb3e172cc3c5bb729ac86353c2221554e40a117426f12b868ecde5ce28 WatchSource:0}: Error finding container a8427abb3e172cc3c5bb729ac86353c2221554e40a117426f12b868ecde5ce28: Status 404 returned error can't find the container with id a8427abb3e172cc3c5bb729ac86353c2221554e40a117426f12b868ecde5ce28 Dec 11 13:19:18 crc kubenswrapper[4898]: I1211 13:19:18.196373 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-j52sl"] Dec 11 13:19:18 crc kubenswrapper[4898]: I1211 13:19:18.494085 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-j52sl" event={"ID":"41793757-3faf-4702-b024-5e2ab032b432","Type":"ContainerStarted","Data":"a8427abb3e172cc3c5bb729ac86353c2221554e40a117426f12b868ecde5ce28"} Dec 11 13:19:21 crc kubenswrapper[4898]: I1211 13:19:21.517082 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-j52sl" event={"ID":"41793757-3faf-4702-b024-5e2ab032b432","Type":"ContainerStarted","Data":"2d897c56b04f7229fcffc6d4499116a9a0e3d619f3c3fd25a20df8dbc816e7b0"} Dec 11 13:19:21 crc kubenswrapper[4898]: I1211 13:19:21.537261 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-operator-6769fb99d-j52sl" podStartSLOduration=2.237928794 podStartE2EDuration="4.537236853s" podCreationTimestamp="2025-12-11 13:19:17 +0000 UTC" firstStartedPulling="2025-12-11 13:19:18.196394562 +0000 UTC m=+915.768721029" lastFinishedPulling="2025-12-11 13:19:20.495702661 +0000 UTC m=+918.068029088" observedRunningTime="2025-12-11 13:19:21.531448835 +0000 UTC m=+919.103775272" watchObservedRunningTime="2025-12-11 13:19:21.537236853 +0000 UTC m=+919.109563330" Dec 11 13:19:24 crc kubenswrapper[4898]: I1211 13:19:24.906951 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tk78x"] Dec 11 13:19:24 crc kubenswrapper[4898]: I1211 13:19:24.909282 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:24 crc kubenswrapper[4898]: I1211 13:19:24.933061 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tk78x"] Dec 11 13:19:25 crc kubenswrapper[4898]: I1211 13:19:25.089327 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-catalog-content\") pod \"community-operators-tk78x\" (UID: \"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521\") " pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:25 crc kubenswrapper[4898]: I1211 13:19:25.089725 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h8ch\" (UniqueName: \"kubernetes.io/projected/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-kube-api-access-2h8ch\") pod \"community-operators-tk78x\" (UID: \"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521\") " pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:25 crc kubenswrapper[4898]: I1211 13:19:25.089822 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-utilities\") pod \"community-operators-tk78x\" (UID: \"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521\") " pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:25 crc kubenswrapper[4898]: I1211 13:19:25.191533 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-catalog-content\") pod \"community-operators-tk78x\" (UID: \"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521\") " pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:25 crc kubenswrapper[4898]: I1211 13:19:25.191703 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h8ch\" (UniqueName: \"kubernetes.io/projected/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-kube-api-access-2h8ch\") pod \"community-operators-tk78x\" (UID: \"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521\") " pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:25 crc kubenswrapper[4898]: I1211 13:19:25.191739 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-utilities\") pod \"community-operators-tk78x\" (UID: \"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521\") " pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:25 crc kubenswrapper[4898]: I1211 13:19:25.192272 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-catalog-content\") pod \"community-operators-tk78x\" (UID: \"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521\") " pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:25 crc kubenswrapper[4898]: I1211 13:19:25.192321 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-utilities\") pod \"community-operators-tk78x\" (UID: \"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521\") " pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:25 crc kubenswrapper[4898]: I1211 13:19:25.228059 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h8ch\" (UniqueName: \"kubernetes.io/projected/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-kube-api-access-2h8ch\") pod \"community-operators-tk78x\" (UID: \"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521\") " pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:25 crc kubenswrapper[4898]: I1211 13:19:25.249533 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:25 crc kubenswrapper[4898]: I1211 13:19:25.729982 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tk78x"] Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.562429 4898 generic.go:334] "Generic (PLEG): container finished" podID="3010c9a6-a94e-4a73-98d0-e7c4e6bf8521" containerID="0b2823a9b1d08b132a77b7b9c88437b247fe7f7f38c00c288002fb459e65a7bd" exitCode=0 Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.562489 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tk78x" event={"ID":"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521","Type":"ContainerDied","Data":"0b2823a9b1d08b132a77b7b9c88437b247fe7f7f38c00c288002fb459e65a7bd"} Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.562778 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tk78x" event={"ID":"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521","Type":"ContainerStarted","Data":"d24c1bb7572e2ed5180ce2ebc8310e957c9a1d941f36891b3e6da91685d9bf43"} Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 
13:19:26.582354 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-65snc"] Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.583804 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-65snc" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.585839 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-m75wq" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.597955 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-65snc"] Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.609583 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-z6l22"] Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.610563 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.611450 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6cwn\" (UniqueName: \"kubernetes.io/projected/231b14b2-7f77-4900-ba69-07247827770f-kube-api-access-h6cwn\") pod \"nmstate-metrics-7f7f7578db-65snc\" (UID: \"231b14b2-7f77-4900-ba69-07247827770f\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-65snc" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.612674 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.626485 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-z6l22"] Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.647634 4898 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-nmstate/nmstate-handler-s4vbv"] Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.648792 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.714354 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6cwn\" (UniqueName: \"kubernetes.io/projected/231b14b2-7f77-4900-ba69-07247827770f-kube-api-access-h6cwn\") pod \"nmstate-metrics-7f7f7578db-65snc\" (UID: \"231b14b2-7f77-4900-ba69-07247827770f\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-65snc" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.714709 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/79dd8f49-7447-49a9-84a3-252ac5286cc3-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-z6l22\" (UID: \"79dd8f49-7447-49a9-84a3-252ac5286cc3\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.714731 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5cm9\" (UniqueName: \"kubernetes.io/projected/79dd8f49-7447-49a9-84a3-252ac5286cc3-kube-api-access-z5cm9\") pod \"nmstate-webhook-f8fb84555-z6l22\" (UID: \"79dd8f49-7447-49a9-84a3-252ac5286cc3\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.729352 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s"] Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.730404 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.740256 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-pqpqz" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.740738 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.744483 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s"] Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.752368 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.760302 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6cwn\" (UniqueName: \"kubernetes.io/projected/231b14b2-7f77-4900-ba69-07247827770f-kube-api-access-h6cwn\") pod \"nmstate-metrics-7f7f7578db-65snc\" (UID: \"231b14b2-7f77-4900-ba69-07247827770f\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-65snc" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.819591 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/caec89bd-563f-4065-87ce-2cb58b5e4dc9-dbus-socket\") pod \"nmstate-handler-s4vbv\" (UID: \"caec89bd-563f-4065-87ce-2cb58b5e4dc9\") " pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.819705 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d33832d5-019a-4630-84b6-01df5d77cade-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-xz82s\" (UID: \"d33832d5-019a-4630-84b6-01df5d77cade\") " 
pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.820207 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/79dd8f49-7447-49a9-84a3-252ac5286cc3-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-z6l22\" (UID: \"79dd8f49-7447-49a9-84a3-252ac5286cc3\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.820243 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5cm9\" (UniqueName: \"kubernetes.io/projected/79dd8f49-7447-49a9-84a3-252ac5286cc3-kube-api-access-z5cm9\") pod \"nmstate-webhook-f8fb84555-z6l22\" (UID: \"79dd8f49-7447-49a9-84a3-252ac5286cc3\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.820288 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7xfr\" (UniqueName: \"kubernetes.io/projected/d33832d5-019a-4630-84b6-01df5d77cade-kube-api-access-b7xfr\") pod \"nmstate-console-plugin-6ff7998486-xz82s\" (UID: \"d33832d5-019a-4630-84b6-01df5d77cade\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.820316 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/caec89bd-563f-4065-87ce-2cb58b5e4dc9-nmstate-lock\") pod \"nmstate-handler-s4vbv\" (UID: \"caec89bd-563f-4065-87ce-2cb58b5e4dc9\") " pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.820336 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lvws\" (UniqueName: 
\"kubernetes.io/projected/caec89bd-563f-4065-87ce-2cb58b5e4dc9-kube-api-access-8lvws\") pod \"nmstate-handler-s4vbv\" (UID: \"caec89bd-563f-4065-87ce-2cb58b5e4dc9\") " pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.820766 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d33832d5-019a-4630-84b6-01df5d77cade-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-xz82s\" (UID: \"d33832d5-019a-4630-84b6-01df5d77cade\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.821244 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/caec89bd-563f-4065-87ce-2cb58b5e4dc9-ovs-socket\") pod \"nmstate-handler-s4vbv\" (UID: \"caec89bd-563f-4065-87ce-2cb58b5e4dc9\") " pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:26 crc kubenswrapper[4898]: E1211 13:19:26.821552 4898 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 11 13:19:26 crc kubenswrapper[4898]: E1211 13:19:26.821613 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79dd8f49-7447-49a9-84a3-252ac5286cc3-tls-key-pair podName:79dd8f49-7447-49a9-84a3-252ac5286cc3 nodeName:}" failed. No retries permitted until 2025-12-11 13:19:27.321591062 +0000 UTC m=+924.893917499 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/79dd8f49-7447-49a9-84a3-252ac5286cc3-tls-key-pair") pod "nmstate-webhook-f8fb84555-z6l22" (UID: "79dd8f49-7447-49a9-84a3-252ac5286cc3") : secret "openshift-nmstate-webhook" not found Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.849727 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5cm9\" (UniqueName: \"kubernetes.io/projected/79dd8f49-7447-49a9-84a3-252ac5286cc3-kube-api-access-z5cm9\") pod \"nmstate-webhook-f8fb84555-z6l22\" (UID: \"79dd8f49-7447-49a9-84a3-252ac5286cc3\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.903868 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-65snc" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.923836 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/caec89bd-563f-4065-87ce-2cb58b5e4dc9-dbus-socket\") pod \"nmstate-handler-s4vbv\" (UID: \"caec89bd-563f-4065-87ce-2cb58b5e4dc9\") " pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.923910 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d33832d5-019a-4630-84b6-01df5d77cade-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-xz82s\" (UID: \"d33832d5-019a-4630-84b6-01df5d77cade\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.924008 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7xfr\" (UniqueName: \"kubernetes.io/projected/d33832d5-019a-4630-84b6-01df5d77cade-kube-api-access-b7xfr\") pod \"nmstate-console-plugin-6ff7998486-xz82s\" (UID: 
\"d33832d5-019a-4630-84b6-01df5d77cade\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.924048 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lvws\" (UniqueName: \"kubernetes.io/projected/caec89bd-563f-4065-87ce-2cb58b5e4dc9-kube-api-access-8lvws\") pod \"nmstate-handler-s4vbv\" (UID: \"caec89bd-563f-4065-87ce-2cb58b5e4dc9\") " pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.924092 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/caec89bd-563f-4065-87ce-2cb58b5e4dc9-nmstate-lock\") pod \"nmstate-handler-s4vbv\" (UID: \"caec89bd-563f-4065-87ce-2cb58b5e4dc9\") " pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.924122 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d33832d5-019a-4630-84b6-01df5d77cade-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-xz82s\" (UID: \"d33832d5-019a-4630-84b6-01df5d77cade\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.924190 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/caec89bd-563f-4065-87ce-2cb58b5e4dc9-dbus-socket\") pod \"nmstate-handler-s4vbv\" (UID: \"caec89bd-563f-4065-87ce-2cb58b5e4dc9\") " pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.924232 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/caec89bd-563f-4065-87ce-2cb58b5e4dc9-nmstate-lock\") pod \"nmstate-handler-s4vbv\" (UID: 
\"caec89bd-563f-4065-87ce-2cb58b5e4dc9\") " pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.924277 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/caec89bd-563f-4065-87ce-2cb58b5e4dc9-ovs-socket\") pod \"nmstate-handler-s4vbv\" (UID: \"caec89bd-563f-4065-87ce-2cb58b5e4dc9\") " pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.924197 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/caec89bd-563f-4065-87ce-2cb58b5e4dc9-ovs-socket\") pod \"nmstate-handler-s4vbv\" (UID: \"caec89bd-563f-4065-87ce-2cb58b5e4dc9\") " pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.925318 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d33832d5-019a-4630-84b6-01df5d77cade-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-xz82s\" (UID: \"d33832d5-019a-4630-84b6-01df5d77cade\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.928423 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d33832d5-019a-4630-84b6-01df5d77cade-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-xz82s\" (UID: \"d33832d5-019a-4630-84b6-01df5d77cade\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.943856 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f8f67696b-jczc2"] Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.950443 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lvws\" (UniqueName: 
\"kubernetes.io/projected/caec89bd-563f-4065-87ce-2cb58b5e4dc9-kube-api-access-8lvws\") pod \"nmstate-handler-s4vbv\" (UID: \"caec89bd-563f-4065-87ce-2cb58b5e4dc9\") " pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.950989 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.968020 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f8f67696b-jczc2"] Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.974718 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7xfr\" (UniqueName: \"kubernetes.io/projected/d33832d5-019a-4630-84b6-01df5d77cade-kube-api-access-b7xfr\") pod \"nmstate-console-plugin-6ff7998486-xz82s\" (UID: \"d33832d5-019a-4630-84b6-01df5d77cade\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s" Dec 11 13:19:26 crc kubenswrapper[4898]: I1211 13:19:26.975268 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.024910 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-config\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.024949 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-service-ca\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.024968 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rz5p\" (UniqueName: \"kubernetes.io/projected/c31c6aa7-c341-4e86-92d2-56ea1ab37168-kube-api-access-7rz5p\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.024986 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-serving-cert\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.025006 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-trusted-ca-bundle\") 
pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.025023 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-oauth-serving-cert\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.025059 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-oauth-config\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.075760 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.126671 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-config\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.126717 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-service-ca\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.126749 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rz5p\" (UniqueName: \"kubernetes.io/projected/c31c6aa7-c341-4e86-92d2-56ea1ab37168-kube-api-access-7rz5p\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.126771 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-serving-cert\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.126793 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-trusted-ca-bundle\") pod \"console-6f8f67696b-jczc2\" (UID: 
\"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.126809 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-oauth-serving-cert\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.126848 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-oauth-config\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.127478 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-config\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.128650 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-trusted-ca-bundle\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.132009 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-oauth-serving-cert\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " 
pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.132701 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-serving-cert\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.133267 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-service-ca\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.135069 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-oauth-config\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.141621 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rz5p\" (UniqueName: \"kubernetes.io/projected/c31c6aa7-c341-4e86-92d2-56ea1ab37168-kube-api-access-7rz5p\") pod \"console-6f8f67696b-jczc2\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.309099 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.330831 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/79dd8f49-7447-49a9-84a3-252ac5286cc3-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-z6l22\" (UID: \"79dd8f49-7447-49a9-84a3-252ac5286cc3\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.334733 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/79dd8f49-7447-49a9-84a3-252ac5286cc3-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-z6l22\" (UID: \"79dd8f49-7447-49a9-84a3-252ac5286cc3\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.379180 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-65snc"] Dec 11 13:19:27 crc kubenswrapper[4898]: W1211 13:19:27.389061 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod231b14b2_7f77_4900_ba69_07247827770f.slice/crio-ddaa5c5cdc23a71293844447ac18e5d32c2a76eb0253d79f4be2f4579904a31f WatchSource:0}: Error finding container ddaa5c5cdc23a71293844447ac18e5d32c2a76eb0253d79f4be2f4579904a31f: Status 404 returned error can't find the container with id ddaa5c5cdc23a71293844447ac18e5d32c2a76eb0253d79f4be2f4579904a31f Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.536013 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.550138 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s"] Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.576547 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-s4vbv" event={"ID":"caec89bd-563f-4065-87ce-2cb58b5e4dc9","Type":"ContainerStarted","Data":"c6525dce443bf15692ec6fa641e60bf43e0cbb4d8fd2ac0d0eaddd810cab9b59"} Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.577849 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-65snc" event={"ID":"231b14b2-7f77-4900-ba69-07247827770f","Type":"ContainerStarted","Data":"ddaa5c5cdc23a71293844447ac18e5d32c2a76eb0253d79f4be2f4579904a31f"} Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.578703 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s" event={"ID":"d33832d5-019a-4630-84b6-01df5d77cade","Type":"ContainerStarted","Data":"3280478c68d5be491744f221b207b14373ca556fb6cd17942fa810ad3a143ab5"} Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.790037 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-z6l22"] Dec 11 13:19:27 crc kubenswrapper[4898]: I1211 13:19:27.844524 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f8f67696b-jczc2"] Dec 11 13:19:27 crc kubenswrapper[4898]: W1211 13:19:27.861403 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc31c6aa7_c341_4e86_92d2_56ea1ab37168.slice/crio-d8eda09cf08c67b4ca42a3ef68d3fd652b8f12351dc59b833a7c89e246030bda WatchSource:0}: Error finding container d8eda09cf08c67b4ca42a3ef68d3fd652b8f12351dc59b833a7c89e246030bda: 
Status 404 returned error can't find the container with id d8eda09cf08c67b4ca42a3ef68d3fd652b8f12351dc59b833a7c89e246030bda Dec 11 13:19:28 crc kubenswrapper[4898]: I1211 13:19:28.587827 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" event={"ID":"79dd8f49-7447-49a9-84a3-252ac5286cc3","Type":"ContainerStarted","Data":"feba0e9ab33d2fb21404f01a28e7fdd31a6158b75c721b67345c80c9537be6cf"} Dec 11 13:19:28 crc kubenswrapper[4898]: I1211 13:19:28.590693 4898 generic.go:334] "Generic (PLEG): container finished" podID="3010c9a6-a94e-4a73-98d0-e7c4e6bf8521" containerID="19f62ba00aaf6ab1fd3e2aedc9ed728bd1bf0880cd56346a25f1494a9e06a0d1" exitCode=0 Dec 11 13:19:28 crc kubenswrapper[4898]: I1211 13:19:28.590756 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tk78x" event={"ID":"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521","Type":"ContainerDied","Data":"19f62ba00aaf6ab1fd3e2aedc9ed728bd1bf0880cd56346a25f1494a9e06a0d1"} Dec 11 13:19:28 crc kubenswrapper[4898]: I1211 13:19:28.593499 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f8f67696b-jczc2" event={"ID":"c31c6aa7-c341-4e86-92d2-56ea1ab37168","Type":"ContainerStarted","Data":"49dba1f0a7e23cfadf22b5fecf6ad3ac4c0b0b87d27f3f34e37ae4fe72a4bd35"} Dec 11 13:19:28 crc kubenswrapper[4898]: I1211 13:19:28.593552 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f8f67696b-jczc2" event={"ID":"c31c6aa7-c341-4e86-92d2-56ea1ab37168","Type":"ContainerStarted","Data":"d8eda09cf08c67b4ca42a3ef68d3fd652b8f12351dc59b833a7c89e246030bda"} Dec 11 13:19:28 crc kubenswrapper[4898]: I1211 13:19:28.631901 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f8f67696b-jczc2" podStartSLOduration=2.6318831940000003 podStartE2EDuration="2.631883194s" podCreationTimestamp="2025-12-11 13:19:26 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:19:28.627290189 +0000 UTC m=+926.199616646" watchObservedRunningTime="2025-12-11 13:19:28.631883194 +0000 UTC m=+926.204209631" Dec 11 13:19:29 crc kubenswrapper[4898]: I1211 13:19:29.608009 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" event={"ID":"79dd8f49-7447-49a9-84a3-252ac5286cc3","Type":"ContainerStarted","Data":"86e7c681a2b3e381492b0552001a315d447e85dcff7988dbd325e59b5b3f2e01"} Dec 11 13:19:29 crc kubenswrapper[4898]: I1211 13:19:29.608639 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" Dec 11 13:19:29 crc kubenswrapper[4898]: I1211 13:19:29.611575 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-65snc" event={"ID":"231b14b2-7f77-4900-ba69-07247827770f","Type":"ContainerStarted","Data":"415127d86b0b42e777ecc7e0d1b778eea3615217f3e75498237c21e7dd87e13e"} Dec 11 13:19:29 crc kubenswrapper[4898]: I1211 13:19:29.631154 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" podStartSLOduration=2.12528101 podStartE2EDuration="3.631131544s" podCreationTimestamp="2025-12-11 13:19:26 +0000 UTC" firstStartedPulling="2025-12-11 13:19:27.817785438 +0000 UTC m=+925.390111875" lastFinishedPulling="2025-12-11 13:19:29.323635932 +0000 UTC m=+926.895962409" observedRunningTime="2025-12-11 13:19:29.627948087 +0000 UTC m=+927.200274534" watchObservedRunningTime="2025-12-11 13:19:29.631131544 +0000 UTC m=+927.203457991" Dec 11 13:19:30 crc kubenswrapper[4898]: I1211 13:19:30.640174 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-s4vbv" 
event={"ID":"caec89bd-563f-4065-87ce-2cb58b5e4dc9","Type":"ContainerStarted","Data":"910a18a59b4004f433496413835faf38818c715509c7dcdb7c0b151782c6066d"} Dec 11 13:19:30 crc kubenswrapper[4898]: I1211 13:19:30.641621 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:30 crc kubenswrapper[4898]: I1211 13:19:30.648951 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tk78x" event={"ID":"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521","Type":"ContainerStarted","Data":"37bfc56bc1ed2966a525549ee29842deeafb804cd2d3070d6eb00e170740767e"} Dec 11 13:19:30 crc kubenswrapper[4898]: I1211 13:19:30.670167 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-s4vbv" podStartSLOduration=2.364114548 podStartE2EDuration="4.670142068s" podCreationTimestamp="2025-12-11 13:19:26 +0000 UTC" firstStartedPulling="2025-12-11 13:19:27.016591904 +0000 UTC m=+924.588918341" lastFinishedPulling="2025-12-11 13:19:29.322619424 +0000 UTC m=+926.894945861" observedRunningTime="2025-12-11 13:19:30.659750415 +0000 UTC m=+928.232076852" watchObservedRunningTime="2025-12-11 13:19:30.670142068 +0000 UTC m=+928.242468505" Dec 11 13:19:30 crc kubenswrapper[4898]: I1211 13:19:30.699811 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tk78x" podStartSLOduration=3.620552137 podStartE2EDuration="6.699793578s" podCreationTimestamp="2025-12-11 13:19:24 +0000 UTC" firstStartedPulling="2025-12-11 13:19:26.564519657 +0000 UTC m=+924.136846084" lastFinishedPulling="2025-12-11 13:19:29.643761088 +0000 UTC m=+927.216087525" observedRunningTime="2025-12-11 13:19:30.699717866 +0000 UTC m=+928.272044303" watchObservedRunningTime="2025-12-11 13:19:30.699793578 +0000 UTC m=+928.272120015" Dec 11 13:19:31 crc kubenswrapper[4898]: I1211 13:19:31.657462 4898 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s" event={"ID":"d33832d5-019a-4630-84b6-01df5d77cade","Type":"ContainerStarted","Data":"17cfab795706dcaaf244ff1b1abb3e12b97a5208e2f1475016bafad281a1a7a5"} Dec 11 13:19:31 crc kubenswrapper[4898]: I1211 13:19:31.683104 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-xz82s" podStartSLOduration=2.673036228 podStartE2EDuration="5.683075951s" podCreationTimestamp="2025-12-11 13:19:26 +0000 UTC" firstStartedPulling="2025-12-11 13:19:27.5756259 +0000 UTC m=+925.147952337" lastFinishedPulling="2025-12-11 13:19:30.585665623 +0000 UTC m=+928.157992060" observedRunningTime="2025-12-11 13:19:31.673245523 +0000 UTC m=+929.245571970" watchObservedRunningTime="2025-12-11 13:19:31.683075951 +0000 UTC m=+929.255402398" Dec 11 13:19:32 crc kubenswrapper[4898]: I1211 13:19:32.669872 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-65snc" event={"ID":"231b14b2-7f77-4900-ba69-07247827770f","Type":"ContainerStarted","Data":"878c315086302f66c96c87c61d3238f6d578bbdc5ff1b6913f733ac1979a7af8"} Dec 11 13:19:32 crc kubenswrapper[4898]: I1211 13:19:32.700213 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-65snc" podStartSLOduration=2.019438251 podStartE2EDuration="6.700177867s" podCreationTimestamp="2025-12-11 13:19:26 +0000 UTC" firstStartedPulling="2025-12-11 13:19:27.391903706 +0000 UTC m=+924.964230133" lastFinishedPulling="2025-12-11 13:19:32.072643312 +0000 UTC m=+929.644969749" observedRunningTime="2025-12-11 13:19:32.692807206 +0000 UTC m=+930.265133643" watchObservedRunningTime="2025-12-11 13:19:32.700177867 +0000 UTC m=+930.272504344" Dec 11 13:19:35 crc kubenswrapper[4898]: I1211 13:19:35.250429 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:35 crc kubenswrapper[4898]: I1211 13:19:35.250706 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:35 crc kubenswrapper[4898]: I1211 13:19:35.310361 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:35 crc kubenswrapper[4898]: I1211 13:19:35.780061 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:37 crc kubenswrapper[4898]: I1211 13:19:37.017747 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-s4vbv" Dec 11 13:19:37 crc kubenswrapper[4898]: I1211 13:19:37.310359 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:37 crc kubenswrapper[4898]: I1211 13:19:37.311514 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:37 crc kubenswrapper[4898]: I1211 13:19:37.317924 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:37 crc kubenswrapper[4898]: I1211 13:19:37.696064 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tk78x"] Dec 11 13:19:37 crc kubenswrapper[4898]: I1211 13:19:37.730737 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tk78x" podUID="3010c9a6-a94e-4a73-98d0-e7c4e6bf8521" containerName="registry-server" containerID="cri-o://37bfc56bc1ed2966a525549ee29842deeafb804cd2d3070d6eb00e170740767e" gracePeriod=2 Dec 11 13:19:37 crc kubenswrapper[4898]: I1211 13:19:37.736486 4898 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:19:37 crc kubenswrapper[4898]: I1211 13:19:37.812071 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bf88584db-ctz8c"] Dec 11 13:19:38 crc kubenswrapper[4898]: I1211 13:19:38.741598 4898 generic.go:334] "Generic (PLEG): container finished" podID="3010c9a6-a94e-4a73-98d0-e7c4e6bf8521" containerID="37bfc56bc1ed2966a525549ee29842deeafb804cd2d3070d6eb00e170740767e" exitCode=0 Dec 11 13:19:38 crc kubenswrapper[4898]: I1211 13:19:38.741938 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tk78x" event={"ID":"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521","Type":"ContainerDied","Data":"37bfc56bc1ed2966a525549ee29842deeafb804cd2d3070d6eb00e170740767e"} Dec 11 13:19:38 crc kubenswrapper[4898]: I1211 13:19:38.834441 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:38 crc kubenswrapper[4898]: I1211 13:19:38.981916 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-catalog-content\") pod \"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521\" (UID: \"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521\") " Dec 11 13:19:38 crc kubenswrapper[4898]: I1211 13:19:38.981990 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-utilities\") pod \"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521\" (UID: \"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521\") " Dec 11 13:19:38 crc kubenswrapper[4898]: I1211 13:19:38.982859 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-utilities" (OuterVolumeSpecName: "utilities") pod 
"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521" (UID: "3010c9a6-a94e-4a73-98d0-e7c4e6bf8521"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:19:38 crc kubenswrapper[4898]: I1211 13:19:38.982943 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h8ch\" (UniqueName: \"kubernetes.io/projected/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-kube-api-access-2h8ch\") pod \"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521\" (UID: \"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521\") " Dec 11 13:19:38 crc kubenswrapper[4898]: I1211 13:19:38.984393 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:19:39 crc kubenswrapper[4898]: I1211 13:19:39.001979 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-kube-api-access-2h8ch" (OuterVolumeSpecName: "kube-api-access-2h8ch") pod "3010c9a6-a94e-4a73-98d0-e7c4e6bf8521" (UID: "3010c9a6-a94e-4a73-98d0-e7c4e6bf8521"). InnerVolumeSpecName "kube-api-access-2h8ch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:19:39 crc kubenswrapper[4898]: I1211 13:19:39.049096 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3010c9a6-a94e-4a73-98d0-e7c4e6bf8521" (UID: "3010c9a6-a94e-4a73-98d0-e7c4e6bf8521"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:19:39 crc kubenswrapper[4898]: I1211 13:19:39.085445 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h8ch\" (UniqueName: \"kubernetes.io/projected/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-kube-api-access-2h8ch\") on node \"crc\" DevicePath \"\"" Dec 11 13:19:39 crc kubenswrapper[4898]: I1211 13:19:39.085628 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:19:39 crc kubenswrapper[4898]: I1211 13:19:39.755594 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tk78x" event={"ID":"3010c9a6-a94e-4a73-98d0-e7c4e6bf8521","Type":"ContainerDied","Data":"d24c1bb7572e2ed5180ce2ebc8310e957c9a1d941f36891b3e6da91685d9bf43"} Dec 11 13:19:39 crc kubenswrapper[4898]: I1211 13:19:39.755887 4898 scope.go:117] "RemoveContainer" containerID="37bfc56bc1ed2966a525549ee29842deeafb804cd2d3070d6eb00e170740767e" Dec 11 13:19:39 crc kubenswrapper[4898]: I1211 13:19:39.755643 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tk78x" Dec 11 13:19:39 crc kubenswrapper[4898]: I1211 13:19:39.785690 4898 scope.go:117] "RemoveContainer" containerID="19f62ba00aaf6ab1fd3e2aedc9ed728bd1bf0880cd56346a25f1494a9e06a0d1" Dec 11 13:19:39 crc kubenswrapper[4898]: I1211 13:19:39.815680 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tk78x"] Dec 11 13:19:39 crc kubenswrapper[4898]: I1211 13:19:39.823729 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tk78x"] Dec 11 13:19:39 crc kubenswrapper[4898]: I1211 13:19:39.847791 4898 scope.go:117] "RemoveContainer" containerID="0b2823a9b1d08b132a77b7b9c88437b247fe7f7f38c00c288002fb459e65a7bd" Dec 11 13:19:40 crc kubenswrapper[4898]: I1211 13:19:40.797111 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3010c9a6-a94e-4a73-98d0-e7c4e6bf8521" path="/var/lib/kubelet/pods/3010c9a6-a94e-4a73-98d0-e7c4e6bf8521/volumes" Dec 11 13:19:47 crc kubenswrapper[4898]: I1211 13:19:47.546133 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" Dec 11 13:20:02 crc kubenswrapper[4898]: I1211 13:20:02.865756 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-bf88584db-ctz8c" podUID="8bdbf089-988a-404d-beb0-212d4aa26387" containerName="console" containerID="cri-o://07a11f0707136045f796a5f1b5b4cdd6f868b69626ab7b74214b44dee21de322" gracePeriod=15 Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.282280 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bf88584db-ctz8c_8bdbf089-988a-404d-beb0-212d4aa26387/console/0.log" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.282730 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.421615 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-service-ca\") pod \"8bdbf089-988a-404d-beb0-212d4aa26387\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.421848 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bdbf089-988a-404d-beb0-212d4aa26387-console-serving-cert\") pod \"8bdbf089-988a-404d-beb0-212d4aa26387\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.421930 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-console-config\") pod \"8bdbf089-988a-404d-beb0-212d4aa26387\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.421988 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-trusted-ca-bundle\") pod \"8bdbf089-988a-404d-beb0-212d4aa26387\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.422009 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-oauth-serving-cert\") pod \"8bdbf089-988a-404d-beb0-212d4aa26387\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.422040 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8bdbf089-988a-404d-beb0-212d4aa26387-console-oauth-config\") pod \"8bdbf089-988a-404d-beb0-212d4aa26387\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.422065 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w68w4\" (UniqueName: \"kubernetes.io/projected/8bdbf089-988a-404d-beb0-212d4aa26387-kube-api-access-w68w4\") pod \"8bdbf089-988a-404d-beb0-212d4aa26387\" (UID: \"8bdbf089-988a-404d-beb0-212d4aa26387\") " Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.423318 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-service-ca" (OuterVolumeSpecName: "service-ca") pod "8bdbf089-988a-404d-beb0-212d4aa26387" (UID: "8bdbf089-988a-404d-beb0-212d4aa26387"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.423352 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8bdbf089-988a-404d-beb0-212d4aa26387" (UID: "8bdbf089-988a-404d-beb0-212d4aa26387"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.423806 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-console-config" (OuterVolumeSpecName: "console-config") pod "8bdbf089-988a-404d-beb0-212d4aa26387" (UID: "8bdbf089-988a-404d-beb0-212d4aa26387"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.423819 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8bdbf089-988a-404d-beb0-212d4aa26387" (UID: "8bdbf089-988a-404d-beb0-212d4aa26387"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.427239 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bdbf089-988a-404d-beb0-212d4aa26387-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8bdbf089-988a-404d-beb0-212d4aa26387" (UID: "8bdbf089-988a-404d-beb0-212d4aa26387"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.428066 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bdbf089-988a-404d-beb0-212d4aa26387-kube-api-access-w68w4" (OuterVolumeSpecName: "kube-api-access-w68w4") pod "8bdbf089-988a-404d-beb0-212d4aa26387" (UID: "8bdbf089-988a-404d-beb0-212d4aa26387"). InnerVolumeSpecName "kube-api-access-w68w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.428447 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bdbf089-988a-404d-beb0-212d4aa26387-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8bdbf089-988a-404d-beb0-212d4aa26387" (UID: "8bdbf089-988a-404d-beb0-212d4aa26387"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.525519 4898 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-console-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.525549 4898 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.525558 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.525567 4898 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8bdbf089-988a-404d-beb0-212d4aa26387-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.525576 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w68w4\" (UniqueName: \"kubernetes.io/projected/8bdbf089-988a-404d-beb0-212d4aa26387-kube-api-access-w68w4\") on node \"crc\" DevicePath \"\"" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.525587 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8bdbf089-988a-404d-beb0-212d4aa26387-service-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.525595 4898 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bdbf089-988a-404d-beb0-212d4aa26387-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:20:03 crc 
kubenswrapper[4898]: I1211 13:20:03.962663 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bf88584db-ctz8c_8bdbf089-988a-404d-beb0-212d4aa26387/console/0.log" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.962729 4898 generic.go:334] "Generic (PLEG): container finished" podID="8bdbf089-988a-404d-beb0-212d4aa26387" containerID="07a11f0707136045f796a5f1b5b4cdd6f868b69626ab7b74214b44dee21de322" exitCode=2 Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.962771 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bf88584db-ctz8c" event={"ID":"8bdbf089-988a-404d-beb0-212d4aa26387","Type":"ContainerDied","Data":"07a11f0707136045f796a5f1b5b4cdd6f868b69626ab7b74214b44dee21de322"} Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.962804 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bf88584db-ctz8c" event={"ID":"8bdbf089-988a-404d-beb0-212d4aa26387","Type":"ContainerDied","Data":"e6327fcecf51c0161df22c63e0b4450e64bb8d4b3bbeacaf3269bcb43e257548"} Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.962847 4898 scope.go:117] "RemoveContainer" containerID="07a11f0707136045f796a5f1b5b4cdd6f868b69626ab7b74214b44dee21de322" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.963027 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bf88584db-ctz8c" Dec 11 13:20:03 crc kubenswrapper[4898]: I1211 13:20:03.998854 4898 scope.go:117] "RemoveContainer" containerID="07a11f0707136045f796a5f1b5b4cdd6f868b69626ab7b74214b44dee21de322" Dec 11 13:20:04 crc kubenswrapper[4898]: E1211 13:20:04.000344 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07a11f0707136045f796a5f1b5b4cdd6f868b69626ab7b74214b44dee21de322\": container with ID starting with 07a11f0707136045f796a5f1b5b4cdd6f868b69626ab7b74214b44dee21de322 not found: ID does not exist" containerID="07a11f0707136045f796a5f1b5b4cdd6f868b69626ab7b74214b44dee21de322" Dec 11 13:20:04 crc kubenswrapper[4898]: I1211 13:20:04.000392 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a11f0707136045f796a5f1b5b4cdd6f868b69626ab7b74214b44dee21de322"} err="failed to get container status \"07a11f0707136045f796a5f1b5b4cdd6f868b69626ab7b74214b44dee21de322\": rpc error: code = NotFound desc = could not find container \"07a11f0707136045f796a5f1b5b4cdd6f868b69626ab7b74214b44dee21de322\": container with ID starting with 07a11f0707136045f796a5f1b5b4cdd6f868b69626ab7b74214b44dee21de322 not found: ID does not exist" Dec 11 13:20:04 crc kubenswrapper[4898]: I1211 13:20:04.006133 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bf88584db-ctz8c"] Dec 11 13:20:04 crc kubenswrapper[4898]: I1211 13:20:04.012495 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-bf88584db-ctz8c"] Dec 11 13:20:04 crc kubenswrapper[4898]: I1211 13:20:04.793307 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bdbf089-988a-404d-beb0-212d4aa26387" path="/var/lib/kubelet/pods/8bdbf089-988a-404d-beb0-212d4aa26387/volumes" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.151085 4898 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d"] Dec 11 13:20:06 crc kubenswrapper[4898]: E1211 13:20:06.151820 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3010c9a6-a94e-4a73-98d0-e7c4e6bf8521" containerName="extract-content" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.151833 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3010c9a6-a94e-4a73-98d0-e7c4e6bf8521" containerName="extract-content" Dec 11 13:20:06 crc kubenswrapper[4898]: E1211 13:20:06.151855 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3010c9a6-a94e-4a73-98d0-e7c4e6bf8521" containerName="extract-utilities" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.151861 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3010c9a6-a94e-4a73-98d0-e7c4e6bf8521" containerName="extract-utilities" Dec 11 13:20:06 crc kubenswrapper[4898]: E1211 13:20:06.151873 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bdbf089-988a-404d-beb0-212d4aa26387" containerName="console" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.151882 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bdbf089-988a-404d-beb0-212d4aa26387" containerName="console" Dec 11 13:20:06 crc kubenswrapper[4898]: E1211 13:20:06.151892 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3010c9a6-a94e-4a73-98d0-e7c4e6bf8521" containerName="registry-server" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.151899 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3010c9a6-a94e-4a73-98d0-e7c4e6bf8521" containerName="registry-server" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.152033 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bdbf089-988a-404d-beb0-212d4aa26387" containerName="console" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.152056 4898 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="3010c9a6-a94e-4a73-98d0-e7c4e6bf8521" containerName="registry-server" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.153062 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.155379 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.160739 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d"] Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.264317 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rcnj\" (UniqueName: \"kubernetes.io/projected/b949410c-72a6-4040-b66d-daacd4c2c4e2-kube-api-access-6rcnj\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d\" (UID: \"b949410c-72a6-4040-b66d-daacd4c2c4e2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.264371 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b949410c-72a6-4040-b66d-daacd4c2c4e2-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d\" (UID: \"b949410c-72a6-4040-b66d-daacd4c2c4e2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.265163 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b949410c-72a6-4040-b66d-daacd4c2c4e2-util\") pod 
\"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d\" (UID: \"b949410c-72a6-4040-b66d-daacd4c2c4e2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.367001 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rcnj\" (UniqueName: \"kubernetes.io/projected/b949410c-72a6-4040-b66d-daacd4c2c4e2-kube-api-access-6rcnj\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d\" (UID: \"b949410c-72a6-4040-b66d-daacd4c2c4e2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.367272 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b949410c-72a6-4040-b66d-daacd4c2c4e2-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d\" (UID: \"b949410c-72a6-4040-b66d-daacd4c2c4e2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.367389 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b949410c-72a6-4040-b66d-daacd4c2c4e2-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d\" (UID: \"b949410c-72a6-4040-b66d-daacd4c2c4e2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.368010 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b949410c-72a6-4040-b66d-daacd4c2c4e2-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d\" (UID: \"b949410c-72a6-4040-b66d-daacd4c2c4e2\") " 
pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.368006 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b949410c-72a6-4040-b66d-daacd4c2c4e2-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d\" (UID: \"b949410c-72a6-4040-b66d-daacd4c2c4e2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.394527 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rcnj\" (UniqueName: \"kubernetes.io/projected/b949410c-72a6-4040-b66d-daacd4c2c4e2-kube-api-access-6rcnj\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d\" (UID: \"b949410c-72a6-4040-b66d-daacd4c2c4e2\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.485127 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.910238 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d"] Dec 11 13:20:06 crc kubenswrapper[4898]: W1211 13:20:06.919516 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb949410c_72a6_4040_b66d_daacd4c2c4e2.slice/crio-1d804a06e0cfc301482241556b8a111a3cbaf8477d4dd9daf376721eea653e20 WatchSource:0}: Error finding container 1d804a06e0cfc301482241556b8a111a3cbaf8477d4dd9daf376721eea653e20: Status 404 returned error can't find the container with id 1d804a06e0cfc301482241556b8a111a3cbaf8477d4dd9daf376721eea653e20 Dec 11 13:20:06 crc kubenswrapper[4898]: I1211 13:20:06.987241 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" event={"ID":"b949410c-72a6-4040-b66d-daacd4c2c4e2","Type":"ContainerStarted","Data":"1d804a06e0cfc301482241556b8a111a3cbaf8477d4dd9daf376721eea653e20"} Dec 11 13:20:08 crc kubenswrapper[4898]: I1211 13:20:08.000581 4898 generic.go:334] "Generic (PLEG): container finished" podID="b949410c-72a6-4040-b66d-daacd4c2c4e2" containerID="32cf7f99f0c924cc4d545bedce167fa16610fb1fcebcc927dfaa4782cad62b36" exitCode=0 Dec 11 13:20:08 crc kubenswrapper[4898]: I1211 13:20:08.000811 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" event={"ID":"b949410c-72a6-4040-b66d-daacd4c2c4e2","Type":"ContainerDied","Data":"32cf7f99f0c924cc4d545bedce167fa16610fb1fcebcc927dfaa4782cad62b36"} Dec 11 13:20:08 crc kubenswrapper[4898]: I1211 13:20:08.003178 4898 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 11 13:20:09 crc kubenswrapper[4898]: I1211 13:20:09.012701 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" event={"ID":"b949410c-72a6-4040-b66d-daacd4c2c4e2","Type":"ContainerStarted","Data":"b6f1b1ee4aee15db76a3948400ee4ba251f4c9263ddce09cfec480dc31e1d967"} Dec 11 13:20:10 crc kubenswrapper[4898]: I1211 13:20:10.022813 4898 generic.go:334] "Generic (PLEG): container finished" podID="b949410c-72a6-4040-b66d-daacd4c2c4e2" containerID="b6f1b1ee4aee15db76a3948400ee4ba251f4c9263ddce09cfec480dc31e1d967" exitCode=0 Dec 11 13:20:10 crc kubenswrapper[4898]: I1211 13:20:10.022868 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" event={"ID":"b949410c-72a6-4040-b66d-daacd4c2c4e2","Type":"ContainerDied","Data":"b6f1b1ee4aee15db76a3948400ee4ba251f4c9263ddce09cfec480dc31e1d967"} Dec 11 13:20:11 crc kubenswrapper[4898]: I1211 13:20:11.033864 4898 generic.go:334] "Generic (PLEG): container finished" podID="b949410c-72a6-4040-b66d-daacd4c2c4e2" containerID="80de8caa75d224ccdac5bcd5f02a124e087e962778b89d9f1f51b1aba36e2eab" exitCode=0 Dec 11 13:20:11 crc kubenswrapper[4898]: I1211 13:20:11.033954 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" event={"ID":"b949410c-72a6-4040-b66d-daacd4c2c4e2","Type":"ContainerDied","Data":"80de8caa75d224ccdac5bcd5f02a124e087e962778b89d9f1f51b1aba36e2eab"} Dec 11 13:20:12 crc kubenswrapper[4898]: I1211 13:20:12.402217 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" Dec 11 13:20:12 crc kubenswrapper[4898]: I1211 13:20:12.571106 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rcnj\" (UniqueName: \"kubernetes.io/projected/b949410c-72a6-4040-b66d-daacd4c2c4e2-kube-api-access-6rcnj\") pod \"b949410c-72a6-4040-b66d-daacd4c2c4e2\" (UID: \"b949410c-72a6-4040-b66d-daacd4c2c4e2\") " Dec 11 13:20:12 crc kubenswrapper[4898]: I1211 13:20:12.571178 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b949410c-72a6-4040-b66d-daacd4c2c4e2-util\") pod \"b949410c-72a6-4040-b66d-daacd4c2c4e2\" (UID: \"b949410c-72a6-4040-b66d-daacd4c2c4e2\") " Dec 11 13:20:12 crc kubenswrapper[4898]: I1211 13:20:12.571222 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b949410c-72a6-4040-b66d-daacd4c2c4e2-bundle\") pod \"b949410c-72a6-4040-b66d-daacd4c2c4e2\" (UID: \"b949410c-72a6-4040-b66d-daacd4c2c4e2\") " Dec 11 13:20:12 crc kubenswrapper[4898]: I1211 13:20:12.572538 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b949410c-72a6-4040-b66d-daacd4c2c4e2-bundle" (OuterVolumeSpecName: "bundle") pod "b949410c-72a6-4040-b66d-daacd4c2c4e2" (UID: "b949410c-72a6-4040-b66d-daacd4c2c4e2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:20:12 crc kubenswrapper[4898]: I1211 13:20:12.579676 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b949410c-72a6-4040-b66d-daacd4c2c4e2-kube-api-access-6rcnj" (OuterVolumeSpecName: "kube-api-access-6rcnj") pod "b949410c-72a6-4040-b66d-daacd4c2c4e2" (UID: "b949410c-72a6-4040-b66d-daacd4c2c4e2"). InnerVolumeSpecName "kube-api-access-6rcnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:20:12 crc kubenswrapper[4898]: I1211 13:20:12.674181 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rcnj\" (UniqueName: \"kubernetes.io/projected/b949410c-72a6-4040-b66d-daacd4c2c4e2-kube-api-access-6rcnj\") on node \"crc\" DevicePath \"\"" Dec 11 13:20:12 crc kubenswrapper[4898]: I1211 13:20:12.674239 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b949410c-72a6-4040-b66d-daacd4c2c4e2-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:20:12 crc kubenswrapper[4898]: I1211 13:20:12.775827 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b949410c-72a6-4040-b66d-daacd4c2c4e2-util" (OuterVolumeSpecName: "util") pod "b949410c-72a6-4040-b66d-daacd4c2c4e2" (UID: "b949410c-72a6-4040-b66d-daacd4c2c4e2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:20:12 crc kubenswrapper[4898]: I1211 13:20:12.777420 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b949410c-72a6-4040-b66d-daacd4c2c4e2-util\") on node \"crc\" DevicePath \"\"" Dec 11 13:20:13 crc kubenswrapper[4898]: I1211 13:20:13.054335 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" event={"ID":"b949410c-72a6-4040-b66d-daacd4c2c4e2","Type":"ContainerDied","Data":"1d804a06e0cfc301482241556b8a111a3cbaf8477d4dd9daf376721eea653e20"} Dec 11 13:20:13 crc kubenswrapper[4898]: I1211 13:20:13.054396 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d804a06e0cfc301482241556b8a111a3cbaf8477d4dd9daf376721eea653e20" Dec 11 13:20:13 crc kubenswrapper[4898]: I1211 13:20:13.054442 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.655774 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s"] Dec 11 13:20:22 crc kubenswrapper[4898]: E1211 13:20:22.656546 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b949410c-72a6-4040-b66d-daacd4c2c4e2" containerName="pull" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.656564 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b949410c-72a6-4040-b66d-daacd4c2c4e2" containerName="pull" Dec 11 13:20:22 crc kubenswrapper[4898]: E1211 13:20:22.656585 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b949410c-72a6-4040-b66d-daacd4c2c4e2" containerName="extract" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.656594 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b949410c-72a6-4040-b66d-daacd4c2c4e2" containerName="extract" Dec 11 13:20:22 crc kubenswrapper[4898]: E1211 13:20:22.656614 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b949410c-72a6-4040-b66d-daacd4c2c4e2" containerName="util" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.656622 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b949410c-72a6-4040-b66d-daacd4c2c4e2" containerName="util" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.656776 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b949410c-72a6-4040-b66d-daacd4c2c4e2" containerName="extract" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.657277 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.660694 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.660735 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.661283 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.661373 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.663299 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-j9k4s" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.670856 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmbfn\" (UniqueName: \"kubernetes.io/projected/a3162798-e8d2-458c-a559-9a246c2cae3b-kube-api-access-mmbfn\") pod \"metallb-operator-controller-manager-7d49b6f5c7-png4s\" (UID: \"a3162798-e8d2-458c-a559-9a246c2cae3b\") " pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.670913 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a3162798-e8d2-458c-a559-9a246c2cae3b-apiservice-cert\") pod \"metallb-operator-controller-manager-7d49b6f5c7-png4s\" (UID: \"a3162798-e8d2-458c-a559-9a246c2cae3b\") " pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 
13:20:22.671157 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3162798-e8d2-458c-a559-9a246c2cae3b-webhook-cert\") pod \"metallb-operator-controller-manager-7d49b6f5c7-png4s\" (UID: \"a3162798-e8d2-458c-a559-9a246c2cae3b\") " pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.693666 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s"] Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.772412 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmbfn\" (UniqueName: \"kubernetes.io/projected/a3162798-e8d2-458c-a559-9a246c2cae3b-kube-api-access-mmbfn\") pod \"metallb-operator-controller-manager-7d49b6f5c7-png4s\" (UID: \"a3162798-e8d2-458c-a559-9a246c2cae3b\") " pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.772493 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a3162798-e8d2-458c-a559-9a246c2cae3b-apiservice-cert\") pod \"metallb-operator-controller-manager-7d49b6f5c7-png4s\" (UID: \"a3162798-e8d2-458c-a559-9a246c2cae3b\") " pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.772570 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3162798-e8d2-458c-a559-9a246c2cae3b-webhook-cert\") pod \"metallb-operator-controller-manager-7d49b6f5c7-png4s\" (UID: \"a3162798-e8d2-458c-a559-9a246c2cae3b\") " pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.787926 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3162798-e8d2-458c-a559-9a246c2cae3b-webhook-cert\") pod \"metallb-operator-controller-manager-7d49b6f5c7-png4s\" (UID: \"a3162798-e8d2-458c-a559-9a246c2cae3b\") " pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.788063 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a3162798-e8d2-458c-a559-9a246c2cae3b-apiservice-cert\") pod \"metallb-operator-controller-manager-7d49b6f5c7-png4s\" (UID: \"a3162798-e8d2-458c-a559-9a246c2cae3b\") " pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.790941 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmbfn\" (UniqueName: \"kubernetes.io/projected/a3162798-e8d2-458c-a559-9a246c2cae3b-kube-api-access-mmbfn\") pod \"metallb-operator-controller-manager-7d49b6f5c7-png4s\" (UID: \"a3162798-e8d2-458c-a559-9a246c2cae3b\") " pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" Dec 11 13:20:22 crc kubenswrapper[4898]: I1211 13:20:22.975570 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.017732 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm"] Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.019148 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.023281 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.023749 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.024161 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-jn4m4" Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.050057 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm"] Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.076762 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/351e2ec9-301e-4fd9-b8ef-45494b9a1291-apiservice-cert\") pod \"metallb-operator-webhook-server-585b5958fd-m8kkm\" (UID: \"351e2ec9-301e-4fd9-b8ef-45494b9a1291\") " pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.076847 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/351e2ec9-301e-4fd9-b8ef-45494b9a1291-webhook-cert\") pod \"metallb-operator-webhook-server-585b5958fd-m8kkm\" (UID: \"351e2ec9-301e-4fd9-b8ef-45494b9a1291\") " pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.076891 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whzrm\" (UniqueName: 
\"kubernetes.io/projected/351e2ec9-301e-4fd9-b8ef-45494b9a1291-kube-api-access-whzrm\") pod \"metallb-operator-webhook-server-585b5958fd-m8kkm\" (UID: \"351e2ec9-301e-4fd9-b8ef-45494b9a1291\") " pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.178648 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/351e2ec9-301e-4fd9-b8ef-45494b9a1291-apiservice-cert\") pod \"metallb-operator-webhook-server-585b5958fd-m8kkm\" (UID: \"351e2ec9-301e-4fd9-b8ef-45494b9a1291\") " pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.178727 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/351e2ec9-301e-4fd9-b8ef-45494b9a1291-webhook-cert\") pod \"metallb-operator-webhook-server-585b5958fd-m8kkm\" (UID: \"351e2ec9-301e-4fd9-b8ef-45494b9a1291\") " pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.178750 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whzrm\" (UniqueName: \"kubernetes.io/projected/351e2ec9-301e-4fd9-b8ef-45494b9a1291-kube-api-access-whzrm\") pod \"metallb-operator-webhook-server-585b5958fd-m8kkm\" (UID: \"351e2ec9-301e-4fd9-b8ef-45494b9a1291\") " pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.216520 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/351e2ec9-301e-4fd9-b8ef-45494b9a1291-apiservice-cert\") pod \"metallb-operator-webhook-server-585b5958fd-m8kkm\" (UID: \"351e2ec9-301e-4fd9-b8ef-45494b9a1291\") " pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" 
Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.216537 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/351e2ec9-301e-4fd9-b8ef-45494b9a1291-webhook-cert\") pod \"metallb-operator-webhook-server-585b5958fd-m8kkm\" (UID: \"351e2ec9-301e-4fd9-b8ef-45494b9a1291\") " pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.226058 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whzrm\" (UniqueName: \"kubernetes.io/projected/351e2ec9-301e-4fd9-b8ef-45494b9a1291-kube-api-access-whzrm\") pod \"metallb-operator-webhook-server-585b5958fd-m8kkm\" (UID: \"351e2ec9-301e-4fd9-b8ef-45494b9a1291\") " pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.412982 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.563930 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s"] Dec 11 13:20:23 crc kubenswrapper[4898]: W1211 13:20:23.575581 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3162798_e8d2_458c_a559_9a246c2cae3b.slice/crio-12f16a5512f5c12adf016821323662f9f7e1a1584f0b5a8c109697c64a7a0e17 WatchSource:0}: Error finding container 12f16a5512f5c12adf016821323662f9f7e1a1584f0b5a8c109697c64a7a0e17: Status 404 returned error can't find the container with id 12f16a5512f5c12adf016821323662f9f7e1a1584f0b5a8c109697c64a7a0e17 Dec 11 13:20:23 crc kubenswrapper[4898]: I1211 13:20:23.843751 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm"] Dec 11 
13:20:23 crc kubenswrapper[4898]: W1211 13:20:23.852637 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod351e2ec9_301e_4fd9_b8ef_45494b9a1291.slice/crio-c2bb9fc7bfcfba92da2b3e024b4d7043ba4c786d30767a44c221fd178a2222fa WatchSource:0}: Error finding container c2bb9fc7bfcfba92da2b3e024b4d7043ba4c786d30767a44c221fd178a2222fa: Status 404 returned error can't find the container with id c2bb9fc7bfcfba92da2b3e024b4d7043ba4c786d30767a44c221fd178a2222fa Dec 11 13:20:24 crc kubenswrapper[4898]: I1211 13:20:24.157663 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" event={"ID":"a3162798-e8d2-458c-a559-9a246c2cae3b","Type":"ContainerStarted","Data":"12f16a5512f5c12adf016821323662f9f7e1a1584f0b5a8c109697c64a7a0e17"} Dec 11 13:20:24 crc kubenswrapper[4898]: I1211 13:20:24.158951 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" event={"ID":"351e2ec9-301e-4fd9-b8ef-45494b9a1291","Type":"ContainerStarted","Data":"c2bb9fc7bfcfba92da2b3e024b4d7043ba4c786d30767a44c221fd178a2222fa"} Dec 11 13:20:27 crc kubenswrapper[4898]: I1211 13:20:27.182636 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" event={"ID":"a3162798-e8d2-458c-a559-9a246c2cae3b","Type":"ContainerStarted","Data":"ff69bb4b41cc588b1609cbbbd7bf5d6ec1cedc9acf3b039bb4c6db56ba6bfdcc"} Dec 11 13:20:27 crc kubenswrapper[4898]: I1211 13:20:27.184564 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" Dec 11 13:20:30 crc kubenswrapper[4898]: I1211 13:20:30.209202 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" 
event={"ID":"351e2ec9-301e-4fd9-b8ef-45494b9a1291","Type":"ContainerStarted","Data":"e7b9ba60b48fde8b6a2e4c37b7d1dcb4a4a83887e7a81b9446c20985a26e9c8d"} Dec 11 13:20:30 crc kubenswrapper[4898]: I1211 13:20:30.209903 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" Dec 11 13:20:30 crc kubenswrapper[4898]: I1211 13:20:30.234137 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" podStartSLOduration=2.171039169 podStartE2EDuration="8.234115858s" podCreationTimestamp="2025-12-11 13:20:22 +0000 UTC" firstStartedPulling="2025-12-11 13:20:23.857428335 +0000 UTC m=+981.429754772" lastFinishedPulling="2025-12-11 13:20:29.920505024 +0000 UTC m=+987.492831461" observedRunningTime="2025-12-11 13:20:30.232005149 +0000 UTC m=+987.804331606" watchObservedRunningTime="2025-12-11 13:20:30.234115858 +0000 UTC m=+987.806442325" Dec 11 13:20:30 crc kubenswrapper[4898]: I1211 13:20:30.237187 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" podStartSLOduration=5.053246463 podStartE2EDuration="8.237169622s" podCreationTimestamp="2025-12-11 13:20:22 +0000 UTC" firstStartedPulling="2025-12-11 13:20:23.578830168 +0000 UTC m=+981.151156605" lastFinishedPulling="2025-12-11 13:20:26.762753327 +0000 UTC m=+984.335079764" observedRunningTime="2025-12-11 13:20:27.222741519 +0000 UTC m=+984.795067956" watchObservedRunningTime="2025-12-11 13:20:30.237169622 +0000 UTC m=+987.809496069" Dec 11 13:20:34 crc kubenswrapper[4898]: I1211 13:20:34.995777 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 11 13:20:34 crc kubenswrapper[4898]: I1211 13:20:34.996089 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:20:43 crc kubenswrapper[4898]: I1211 13:20:43.420066 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" Dec 11 13:21:02 crc kubenswrapper[4898]: I1211 13:21:02.979105 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7d49b6f5c7-png4s" Dec 11 13:21:03 crc kubenswrapper[4898]: I1211 13:21:03.931179 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-6gtqk"] Dec 11 13:21:03 crc kubenswrapper[4898]: I1211 13:21:03.935109 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:03 crc kubenswrapper[4898]: I1211 13:21:03.951069 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-5tk5d" Dec 11 13:21:03 crc kubenswrapper[4898]: I1211 13:21:03.951432 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 11 13:21:03 crc kubenswrapper[4898]: I1211 13:21:03.961495 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 11 13:21:03 crc kubenswrapper[4898]: I1211 13:21:03.964417 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z"] Dec 11 13:21:03 crc kubenswrapper[4898]: I1211 13:21:03.965419 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" Dec 11 13:21:03 crc kubenswrapper[4898]: I1211 13:21:03.969025 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 11 13:21:03 crc kubenswrapper[4898]: I1211 13:21:03.995850 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z"] Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.094501 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d898c00f-7c50-483d-84f3-9c502696b39a-frr-conf\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.094566 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d898c00f-7c50-483d-84f3-9c502696b39a-metrics-certs\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.094599 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d898c00f-7c50-483d-84f3-9c502696b39a-reloader\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.094636 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0569df8-06fa-4d31-a59a-904b90e4a0ca-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-llp2z\" (UID: \"c0569df8-06fa-4d31-a59a-904b90e4a0ca\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" Dec 11 13:21:04 crc 
kubenswrapper[4898]: I1211 13:21:04.094663 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fptcx\" (UniqueName: \"kubernetes.io/projected/d898c00f-7c50-483d-84f3-9c502696b39a-kube-api-access-fptcx\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.094684 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d898c00f-7c50-483d-84f3-9c502696b39a-frr-sockets\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.094788 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d898c00f-7c50-483d-84f3-9c502696b39a-frr-startup\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.094838 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckk54\" (UniqueName: \"kubernetes.io/projected/c0569df8-06fa-4d31-a59a-904b90e4a0ca-kube-api-access-ckk54\") pod \"frr-k8s-webhook-server-7784b6fcf-llp2z\" (UID: \"c0569df8-06fa-4d31-a59a-904b90e4a0ca\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.094866 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d898c00f-7c50-483d-84f3-9c502696b39a-metrics\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 
13:21:04.095102 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-7jgw9"] Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.096917 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-7jgw9" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.098896 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-bhmrd" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.099028 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.099397 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.100166 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.139717 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5bddd4b946-bkr8v"] Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.141331 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bddd4b946-bkr8v" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.158451 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.164675 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-bkr8v"] Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.195951 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plzm9\" (UniqueName: \"kubernetes.io/projected/80d1af81-ad34-4f94-afd2-94c3773ea9ea-kube-api-access-plzm9\") pod \"speaker-7jgw9\" (UID: \"80d1af81-ad34-4f94-afd2-94c3773ea9ea\") " pod="metallb-system/speaker-7jgw9" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.195996 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/80d1af81-ad34-4f94-afd2-94c3773ea9ea-memberlist\") pod \"speaker-7jgw9\" (UID: \"80d1af81-ad34-4f94-afd2-94c3773ea9ea\") " pod="metallb-system/speaker-7jgw9" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.196026 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0569df8-06fa-4d31-a59a-904b90e4a0ca-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-llp2z\" (UID: \"c0569df8-06fa-4d31-a59a-904b90e4a0ca\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.196052 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80d1af81-ad34-4f94-afd2-94c3773ea9ea-metrics-certs\") pod \"speaker-7jgw9\" (UID: \"80d1af81-ad34-4f94-afd2-94c3773ea9ea\") " pod="metallb-system/speaker-7jgw9" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 
13:21:04.196084 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fptcx\" (UniqueName: \"kubernetes.io/projected/d898c00f-7c50-483d-84f3-9c502696b39a-kube-api-access-fptcx\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.196107 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d898c00f-7c50-483d-84f3-9c502696b39a-frr-sockets\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.196132 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d898c00f-7c50-483d-84f3-9c502696b39a-frr-startup\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.196153 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckk54\" (UniqueName: \"kubernetes.io/projected/c0569df8-06fa-4d31-a59a-904b90e4a0ca-kube-api-access-ckk54\") pod \"frr-k8s-webhook-server-7784b6fcf-llp2z\" (UID: \"c0569df8-06fa-4d31-a59a-904b90e4a0ca\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.196167 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d898c00f-7c50-483d-84f3-9c502696b39a-metrics\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.196209 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/80d1af81-ad34-4f94-afd2-94c3773ea9ea-metallb-excludel2\") pod \"speaker-7jgw9\" (UID: \"80d1af81-ad34-4f94-afd2-94c3773ea9ea\") " pod="metallb-system/speaker-7jgw9" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.196229 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d898c00f-7c50-483d-84f3-9c502696b39a-frr-conf\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.196260 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d898c00f-7c50-483d-84f3-9c502696b39a-metrics-certs\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.196289 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d898c00f-7c50-483d-84f3-9c502696b39a-reloader\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.196678 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d898c00f-7c50-483d-84f3-9c502696b39a-reloader\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: E1211 13:21:04.196790 4898 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 11 13:21:04 crc kubenswrapper[4898]: E1211 13:21:04.196842 4898 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c0569df8-06fa-4d31-a59a-904b90e4a0ca-cert podName:c0569df8-06fa-4d31-a59a-904b90e4a0ca nodeName:}" failed. No retries permitted until 2025-12-11 13:21:04.696823946 +0000 UTC m=+1022.269150383 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c0569df8-06fa-4d31-a59a-904b90e4a0ca-cert") pod "frr-k8s-webhook-server-7784b6fcf-llp2z" (UID: "c0569df8-06fa-4d31-a59a-904b90e4a0ca") : secret "frr-k8s-webhook-server-cert" not found Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.197379 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d898c00f-7c50-483d-84f3-9c502696b39a-frr-sockets\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.198007 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d898c00f-7c50-483d-84f3-9c502696b39a-frr-startup\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.198312 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d898c00f-7c50-483d-84f3-9c502696b39a-metrics\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.198505 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d898c00f-7c50-483d-84f3-9c502696b39a-frr-conf\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.203786 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d898c00f-7c50-483d-84f3-9c502696b39a-metrics-certs\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.225159 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fptcx\" (UniqueName: \"kubernetes.io/projected/d898c00f-7c50-483d-84f3-9c502696b39a-kube-api-access-fptcx\") pod \"frr-k8s-6gtqk\" (UID: \"d898c00f-7c50-483d-84f3-9c502696b39a\") " pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.238489 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckk54\" (UniqueName: \"kubernetes.io/projected/c0569df8-06fa-4d31-a59a-904b90e4a0ca-kube-api-access-ckk54\") pod \"frr-k8s-webhook-server-7784b6fcf-llp2z\" (UID: \"c0569df8-06fa-4d31-a59a-904b90e4a0ca\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.261080 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.297420 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfd0fc01-1bde-4b11-bbdd-d95693d0dd15-metrics-certs\") pod \"controller-5bddd4b946-bkr8v\" (UID: \"cfd0fc01-1bde-4b11-bbdd-d95693d0dd15\") " pod="metallb-system/controller-5bddd4b946-bkr8v" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.297484 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m57ll\" (UniqueName: \"kubernetes.io/projected/cfd0fc01-1bde-4b11-bbdd-d95693d0dd15-kube-api-access-m57ll\") pod \"controller-5bddd4b946-bkr8v\" (UID: \"cfd0fc01-1bde-4b11-bbdd-d95693d0dd15\") " pod="metallb-system/controller-5bddd4b946-bkr8v" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.297509 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plzm9\" (UniqueName: \"kubernetes.io/projected/80d1af81-ad34-4f94-afd2-94c3773ea9ea-kube-api-access-plzm9\") pod \"speaker-7jgw9\" (UID: \"80d1af81-ad34-4f94-afd2-94c3773ea9ea\") " pod="metallb-system/speaker-7jgw9" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.297529 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/80d1af81-ad34-4f94-afd2-94c3773ea9ea-memberlist\") pod \"speaker-7jgw9\" (UID: \"80d1af81-ad34-4f94-afd2-94c3773ea9ea\") " pod="metallb-system/speaker-7jgw9" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.297570 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80d1af81-ad34-4f94-afd2-94c3773ea9ea-metrics-certs\") pod \"speaker-7jgw9\" (UID: \"80d1af81-ad34-4f94-afd2-94c3773ea9ea\") " pod="metallb-system/speaker-7jgw9" Dec 11 13:21:04 
crc kubenswrapper[4898]: I1211 13:21:04.297645 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfd0fc01-1bde-4b11-bbdd-d95693d0dd15-cert\") pod \"controller-5bddd4b946-bkr8v\" (UID: \"cfd0fc01-1bde-4b11-bbdd-d95693d0dd15\") " pod="metallb-system/controller-5bddd4b946-bkr8v" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.297665 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/80d1af81-ad34-4f94-afd2-94c3773ea9ea-metallb-excludel2\") pod \"speaker-7jgw9\" (UID: \"80d1af81-ad34-4f94-afd2-94c3773ea9ea\") " pod="metallb-system/speaker-7jgw9" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.298373 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/80d1af81-ad34-4f94-afd2-94c3773ea9ea-metallb-excludel2\") pod \"speaker-7jgw9\" (UID: \"80d1af81-ad34-4f94-afd2-94c3773ea9ea\") " pod="metallb-system/speaker-7jgw9" Dec 11 13:21:04 crc kubenswrapper[4898]: E1211 13:21:04.298780 4898 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 11 13:21:04 crc kubenswrapper[4898]: E1211 13:21:04.298830 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80d1af81-ad34-4f94-afd2-94c3773ea9ea-memberlist podName:80d1af81-ad34-4f94-afd2-94c3773ea9ea nodeName:}" failed. No retries permitted until 2025-12-11 13:21:04.79881726 +0000 UTC m=+1022.371143697 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/80d1af81-ad34-4f94-afd2-94c3773ea9ea-memberlist") pod "speaker-7jgw9" (UID: "80d1af81-ad34-4f94-afd2-94c3773ea9ea") : secret "metallb-memberlist" not found Dec 11 13:21:04 crc kubenswrapper[4898]: E1211 13:21:04.299025 4898 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 11 13:21:04 crc kubenswrapper[4898]: E1211 13:21:04.299056 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80d1af81-ad34-4f94-afd2-94c3773ea9ea-metrics-certs podName:80d1af81-ad34-4f94-afd2-94c3773ea9ea nodeName:}" failed. No retries permitted until 2025-12-11 13:21:04.799047996 +0000 UTC m=+1022.371374433 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80d1af81-ad34-4f94-afd2-94c3773ea9ea-metrics-certs") pod "speaker-7jgw9" (UID: "80d1af81-ad34-4f94-afd2-94c3773ea9ea") : secret "speaker-certs-secret" not found Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.315327 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plzm9\" (UniqueName: \"kubernetes.io/projected/80d1af81-ad34-4f94-afd2-94c3773ea9ea-kube-api-access-plzm9\") pod \"speaker-7jgw9\" (UID: \"80d1af81-ad34-4f94-afd2-94c3773ea9ea\") " pod="metallb-system/speaker-7jgw9" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.400572 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfd0fc01-1bde-4b11-bbdd-d95693d0dd15-cert\") pod \"controller-5bddd4b946-bkr8v\" (UID: \"cfd0fc01-1bde-4b11-bbdd-d95693d0dd15\") " pod="metallb-system/controller-5bddd4b946-bkr8v" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.400713 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/cfd0fc01-1bde-4b11-bbdd-d95693d0dd15-metrics-certs\") pod \"controller-5bddd4b946-bkr8v\" (UID: \"cfd0fc01-1bde-4b11-bbdd-d95693d0dd15\") " pod="metallb-system/controller-5bddd4b946-bkr8v" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.400752 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m57ll\" (UniqueName: \"kubernetes.io/projected/cfd0fc01-1bde-4b11-bbdd-d95693d0dd15-kube-api-access-m57ll\") pod \"controller-5bddd4b946-bkr8v\" (UID: \"cfd0fc01-1bde-4b11-bbdd-d95693d0dd15\") " pod="metallb-system/controller-5bddd4b946-bkr8v" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.409307 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.411345 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfd0fc01-1bde-4b11-bbdd-d95693d0dd15-metrics-certs\") pod \"controller-5bddd4b946-bkr8v\" (UID: \"cfd0fc01-1bde-4b11-bbdd-d95693d0dd15\") " pod="metallb-system/controller-5bddd4b946-bkr8v" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.415036 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfd0fc01-1bde-4b11-bbdd-d95693d0dd15-cert\") pod \"controller-5bddd4b946-bkr8v\" (UID: \"cfd0fc01-1bde-4b11-bbdd-d95693d0dd15\") " pod="metallb-system/controller-5bddd4b946-bkr8v" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.419429 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m57ll\" (UniqueName: \"kubernetes.io/projected/cfd0fc01-1bde-4b11-bbdd-d95693d0dd15-kube-api-access-m57ll\") pod \"controller-5bddd4b946-bkr8v\" (UID: \"cfd0fc01-1bde-4b11-bbdd-d95693d0dd15\") " pod="metallb-system/controller-5bddd4b946-bkr8v" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.477746 4898 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5bddd4b946-bkr8v" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.525676 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6gtqk" event={"ID":"d898c00f-7c50-483d-84f3-9c502696b39a","Type":"ContainerStarted","Data":"fc48857ffa6739cc1d87713c0df856aff2a3ef61c406bf6a0ffbcd2ec6d685c7"} Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.706401 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0569df8-06fa-4d31-a59a-904b90e4a0ca-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-llp2z\" (UID: \"c0569df8-06fa-4d31-a59a-904b90e4a0ca\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.710083 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0569df8-06fa-4d31-a59a-904b90e4a0ca-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-llp2z\" (UID: \"c0569df8-06fa-4d31-a59a-904b90e4a0ca\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.807846 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/80d1af81-ad34-4f94-afd2-94c3773ea9ea-memberlist\") pod \"speaker-7jgw9\" (UID: \"80d1af81-ad34-4f94-afd2-94c3773ea9ea\") " pod="metallb-system/speaker-7jgw9" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.807890 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80d1af81-ad34-4f94-afd2-94c3773ea9ea-metrics-certs\") pod \"speaker-7jgw9\" (UID: \"80d1af81-ad34-4f94-afd2-94c3773ea9ea\") " pod="metallb-system/speaker-7jgw9" Dec 11 13:21:04 crc kubenswrapper[4898]: E1211 13:21:04.808171 4898 secret.go:188] 
Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 11 13:21:04 crc kubenswrapper[4898]: E1211 13:21:04.808292 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80d1af81-ad34-4f94-afd2-94c3773ea9ea-memberlist podName:80d1af81-ad34-4f94-afd2-94c3773ea9ea nodeName:}" failed. No retries permitted until 2025-12-11 13:21:05.808264136 +0000 UTC m=+1023.380590613 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/80d1af81-ad34-4f94-afd2-94c3773ea9ea-memberlist") pod "speaker-7jgw9" (UID: "80d1af81-ad34-4f94-afd2-94c3773ea9ea") : secret "metallb-memberlist" not found Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.810877 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80d1af81-ad34-4f94-afd2-94c3773ea9ea-metrics-certs\") pod \"speaker-7jgw9\" (UID: \"80d1af81-ad34-4f94-afd2-94c3773ea9ea\") " pod="metallb-system/speaker-7jgw9" Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.881873 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-bkr8v"] Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.883512 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" Dec 11 13:21:04 crc kubenswrapper[4898]: W1211 13:21:04.897152 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfd0fc01_1bde_4b11_bbdd_d95693d0dd15.slice/crio-269cf66d1da93b820710d0e3481b58365288bf581a338b45bd57a4d0c61c6ca7 WatchSource:0}: Error finding container 269cf66d1da93b820710d0e3481b58365288bf581a338b45bd57a4d0c61c6ca7: Status 404 returned error can't find the container with id 269cf66d1da93b820710d0e3481b58365288bf581a338b45bd57a4d0c61c6ca7 Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.996114 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:21:04 crc kubenswrapper[4898]: I1211 13:21:04.996387 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:21:05 crc kubenswrapper[4898]: I1211 13:21:05.398170 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z"] Dec 11 13:21:05 crc kubenswrapper[4898]: W1211 13:21:05.399111 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0569df8_06fa_4d31_a59a_904b90e4a0ca.slice/crio-ffaf5eb40c30995b0a40b5fa90f6a0b9663316982abe3f1519549932eded34cf WatchSource:0}: Error finding container ffaf5eb40c30995b0a40b5fa90f6a0b9663316982abe3f1519549932eded34cf: Status 404 returned error can't 
find the container with id ffaf5eb40c30995b0a40b5fa90f6a0b9663316982abe3f1519549932eded34cf Dec 11 13:21:05 crc kubenswrapper[4898]: I1211 13:21:05.536115 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-bkr8v" event={"ID":"cfd0fc01-1bde-4b11-bbdd-d95693d0dd15","Type":"ContainerStarted","Data":"63070b39da9a875ba61c2cbd562b8fdb8282f9071d8be9c28391e54deec1c538"} Dec 11 13:21:05 crc kubenswrapper[4898]: I1211 13:21:05.536184 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-bkr8v" event={"ID":"cfd0fc01-1bde-4b11-bbdd-d95693d0dd15","Type":"ContainerStarted","Data":"3e16a0086e3ec93d46bf43c8e759cc67b91eebe75876b80bf89ead505c54702f"} Dec 11 13:21:05 crc kubenswrapper[4898]: I1211 13:21:05.536220 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-bkr8v" event={"ID":"cfd0fc01-1bde-4b11-bbdd-d95693d0dd15","Type":"ContainerStarted","Data":"269cf66d1da93b820710d0e3481b58365288bf581a338b45bd57a4d0c61c6ca7"} Dec 11 13:21:05 crc kubenswrapper[4898]: I1211 13:21:05.536241 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5bddd4b946-bkr8v" Dec 11 13:21:05 crc kubenswrapper[4898]: I1211 13:21:05.537058 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" event={"ID":"c0569df8-06fa-4d31-a59a-904b90e4a0ca","Type":"ContainerStarted","Data":"ffaf5eb40c30995b0a40b5fa90f6a0b9663316982abe3f1519549932eded34cf"} Dec 11 13:21:05 crc kubenswrapper[4898]: I1211 13:21:05.552331 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5bddd4b946-bkr8v" podStartSLOduration=1.552311645 podStartE2EDuration="1.552311645s" podCreationTimestamp="2025-12-11 13:21:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-11 13:21:05.550421323 +0000 UTC m=+1023.122747760" watchObservedRunningTime="2025-12-11 13:21:05.552311645 +0000 UTC m=+1023.124638082" Dec 11 13:21:05 crc kubenswrapper[4898]: I1211 13:21:05.833975 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/80d1af81-ad34-4f94-afd2-94c3773ea9ea-memberlist\") pod \"speaker-7jgw9\" (UID: \"80d1af81-ad34-4f94-afd2-94c3773ea9ea\") " pod="metallb-system/speaker-7jgw9" Dec 11 13:21:05 crc kubenswrapper[4898]: I1211 13:21:05.844374 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/80d1af81-ad34-4f94-afd2-94c3773ea9ea-memberlist\") pod \"speaker-7jgw9\" (UID: \"80d1af81-ad34-4f94-afd2-94c3773ea9ea\") " pod="metallb-system/speaker-7jgw9" Dec 11 13:21:05 crc kubenswrapper[4898]: I1211 13:21:05.911732 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-7jgw9" Dec 11 13:21:05 crc kubenswrapper[4898]: W1211 13:21:05.962576 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80d1af81_ad34_4f94_afd2_94c3773ea9ea.slice/crio-9378ee1ef172c60b5e1449d8646ae6723443281d2fc231f75adc809325cebbd2 WatchSource:0}: Error finding container 9378ee1ef172c60b5e1449d8646ae6723443281d2fc231f75adc809325cebbd2: Status 404 returned error can't find the container with id 9378ee1ef172c60b5e1449d8646ae6723443281d2fc231f75adc809325cebbd2 Dec 11 13:21:06 crc kubenswrapper[4898]: I1211 13:21:06.562570 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7jgw9" event={"ID":"80d1af81-ad34-4f94-afd2-94c3773ea9ea","Type":"ContainerStarted","Data":"0118408fbb2c076001ba9423193529e8687f55b0c028875d6891c2017b967b83"} Dec 11 13:21:06 crc kubenswrapper[4898]: I1211 13:21:06.562823 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/speaker-7jgw9" event={"ID":"80d1af81-ad34-4f94-afd2-94c3773ea9ea","Type":"ContainerStarted","Data":"2cfb2c67bdab33eb6f318d6be708b6e1962ee20eb074ed81185ac3254fffdab4"} Dec 11 13:21:06 crc kubenswrapper[4898]: I1211 13:21:06.562837 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7jgw9" event={"ID":"80d1af81-ad34-4f94-afd2-94c3773ea9ea","Type":"ContainerStarted","Data":"9378ee1ef172c60b5e1449d8646ae6723443281d2fc231f75adc809325cebbd2"} Dec 11 13:21:06 crc kubenswrapper[4898]: I1211 13:21:06.563309 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-7jgw9" Dec 11 13:21:06 crc kubenswrapper[4898]: I1211 13:21:06.593566 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-7jgw9" podStartSLOduration=2.5935498150000003 podStartE2EDuration="2.593549815s" podCreationTimestamp="2025-12-11 13:21:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:21:06.587910199 +0000 UTC m=+1024.160236636" watchObservedRunningTime="2025-12-11 13:21:06.593549815 +0000 UTC m=+1024.165876252" Dec 11 13:21:12 crc kubenswrapper[4898]: I1211 13:21:12.621385 4898 generic.go:334] "Generic (PLEG): container finished" podID="d898c00f-7c50-483d-84f3-9c502696b39a" containerID="4b48d911ac51ceaa0ae0f227611dddc4c578467339ef49aa945cb3c159531715" exitCode=0 Dec 11 13:21:12 crc kubenswrapper[4898]: I1211 13:21:12.621483 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6gtqk" event={"ID":"d898c00f-7c50-483d-84f3-9c502696b39a","Type":"ContainerDied","Data":"4b48d911ac51ceaa0ae0f227611dddc4c578467339ef49aa945cb3c159531715"} Dec 11 13:21:12 crc kubenswrapper[4898]: I1211 13:21:12.624472 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" 
event={"ID":"c0569df8-06fa-4d31-a59a-904b90e4a0ca","Type":"ContainerStarted","Data":"34f28d0b4ab679632f86f1b5d58c05575e908ebba363aace242534e72c72d6fa"} Dec 11 13:21:12 crc kubenswrapper[4898]: I1211 13:21:12.624813 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" Dec 11 13:21:12 crc kubenswrapper[4898]: I1211 13:21:12.676518 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" podStartSLOduration=3.103417511 podStartE2EDuration="9.676493763s" podCreationTimestamp="2025-12-11 13:21:03 +0000 UTC" firstStartedPulling="2025-12-11 13:21:05.400690692 +0000 UTC m=+1022.973017129" lastFinishedPulling="2025-12-11 13:21:11.973766934 +0000 UTC m=+1029.546093381" observedRunningTime="2025-12-11 13:21:12.671801964 +0000 UTC m=+1030.244128421" watchObservedRunningTime="2025-12-11 13:21:12.676493763 +0000 UTC m=+1030.248820210" Dec 11 13:21:13 crc kubenswrapper[4898]: I1211 13:21:13.642658 4898 generic.go:334] "Generic (PLEG): container finished" podID="d898c00f-7c50-483d-84f3-9c502696b39a" containerID="a56183824d732b238e3e980f5694e10f3f43bef1ca7d6f2a9640cb81a778dd2c" exitCode=0 Dec 11 13:21:13 crc kubenswrapper[4898]: I1211 13:21:13.642754 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6gtqk" event={"ID":"d898c00f-7c50-483d-84f3-9c502696b39a","Type":"ContainerDied","Data":"a56183824d732b238e3e980f5694e10f3f43bef1ca7d6f2a9640cb81a778dd2c"} Dec 11 13:21:14 crc kubenswrapper[4898]: I1211 13:21:14.651785 4898 generic.go:334] "Generic (PLEG): container finished" podID="d898c00f-7c50-483d-84f3-9c502696b39a" containerID="79171ccbd79816b25391af6c8838670b50ad85c45874caf970e0251a6053cb19" exitCode=0 Dec 11 13:21:14 crc kubenswrapper[4898]: I1211 13:21:14.651856 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6gtqk" 
event={"ID":"d898c00f-7c50-483d-84f3-9c502696b39a","Type":"ContainerDied","Data":"79171ccbd79816b25391af6c8838670b50ad85c45874caf970e0251a6053cb19"} Dec 11 13:21:15 crc kubenswrapper[4898]: I1211 13:21:15.672525 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6gtqk" event={"ID":"d898c00f-7c50-483d-84f3-9c502696b39a","Type":"ContainerStarted","Data":"7431525ca1322128640b240fe84d7f936c28b207999304b9cc1ea2692ce670fb"} Dec 11 13:21:15 crc kubenswrapper[4898]: I1211 13:21:15.672903 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6gtqk" event={"ID":"d898c00f-7c50-483d-84f3-9c502696b39a","Type":"ContainerStarted","Data":"5fee51153bd283e97956e1996b7cafe2566de8290b01c96324743ec1a76a0f9e"} Dec 11 13:21:15 crc kubenswrapper[4898]: I1211 13:21:15.672920 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6gtqk" event={"ID":"d898c00f-7c50-483d-84f3-9c502696b39a","Type":"ContainerStarted","Data":"733d478db220de49edf8e8feb941d0b9151d8d201e158649abadc90e697c08ee"} Dec 11 13:21:15 crc kubenswrapper[4898]: I1211 13:21:15.672936 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6gtqk" event={"ID":"d898c00f-7c50-483d-84f3-9c502696b39a","Type":"ContainerStarted","Data":"327b3e02274fa424bc38f29f3663a5fac0c739d9c3423621c944ccbcb6810c8e"} Dec 11 13:21:15 crc kubenswrapper[4898]: I1211 13:21:15.672954 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6gtqk" event={"ID":"d898c00f-7c50-483d-84f3-9c502696b39a","Type":"ContainerStarted","Data":"81b610b6e1b07706276adab2ea6bca3196a1ea67968d65a365ac15ce0eba7b74"} Dec 11 13:21:16 crc kubenswrapper[4898]: I1211 13:21:16.694736 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6gtqk" event={"ID":"d898c00f-7c50-483d-84f3-9c502696b39a","Type":"ContainerStarted","Data":"917207d2b65d33dacb13d95de8aae4ec30373d6b252ff4e0c118e16adc28217f"} Dec 11 13:21:16 crc 
kubenswrapper[4898]: I1211 13:21:16.695220 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:16 crc kubenswrapper[4898]: I1211 13:21:16.746442 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-6gtqk" podStartSLOduration=6.289717696 podStartE2EDuration="13.746421499s" podCreationTimestamp="2025-12-11 13:21:03 +0000 UTC" firstStartedPulling="2025-12-11 13:21:04.476931204 +0000 UTC m=+1022.049257641" lastFinishedPulling="2025-12-11 13:21:11.933634967 +0000 UTC m=+1029.505961444" observedRunningTime="2025-12-11 13:21:16.734168271 +0000 UTC m=+1034.306494748" watchObservedRunningTime="2025-12-11 13:21:16.746421499 +0000 UTC m=+1034.318747946" Dec 11 13:21:19 crc kubenswrapper[4898]: I1211 13:21:19.262784 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:19 crc kubenswrapper[4898]: I1211 13:21:19.328404 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:24 crc kubenswrapper[4898]: I1211 13:21:24.264819 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-6gtqk" Dec 11 13:21:24 crc kubenswrapper[4898]: I1211 13:21:24.484851 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5bddd4b946-bkr8v" Dec 11 13:21:24 crc kubenswrapper[4898]: I1211 13:21:24.891592 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" Dec 11 13:21:25 crc kubenswrapper[4898]: I1211 13:21:25.917270 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-7jgw9" Dec 11 13:21:29 crc kubenswrapper[4898]: I1211 13:21:29.016620 4898 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-s6pgl"] Dec 11 13:21:29 crc kubenswrapper[4898]: I1211 13:21:29.018750 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s6pgl" Dec 11 13:21:29 crc kubenswrapper[4898]: I1211 13:21:29.023801 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 11 13:21:29 crc kubenswrapper[4898]: I1211 13:21:29.024917 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 11 13:21:29 crc kubenswrapper[4898]: I1211 13:21:29.027345 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-l6kpf" Dec 11 13:21:29 crc kubenswrapper[4898]: I1211 13:21:29.040387 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s6pgl"] Dec 11 13:21:29 crc kubenswrapper[4898]: I1211 13:21:29.190081 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hghvk\" (UniqueName: \"kubernetes.io/projected/b65b88ef-7a96-48cb-b629-252e7e269f17-kube-api-access-hghvk\") pod \"openstack-operator-index-s6pgl\" (UID: \"b65b88ef-7a96-48cb-b629-252e7e269f17\") " pod="openstack-operators/openstack-operator-index-s6pgl" Dec 11 13:21:29 crc kubenswrapper[4898]: I1211 13:21:29.291449 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hghvk\" (UniqueName: \"kubernetes.io/projected/b65b88ef-7a96-48cb-b629-252e7e269f17-kube-api-access-hghvk\") pod \"openstack-operator-index-s6pgl\" (UID: \"b65b88ef-7a96-48cb-b629-252e7e269f17\") " pod="openstack-operators/openstack-operator-index-s6pgl" Dec 11 13:21:29 crc kubenswrapper[4898]: I1211 13:21:29.309839 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hghvk\" 
(UniqueName: \"kubernetes.io/projected/b65b88ef-7a96-48cb-b629-252e7e269f17-kube-api-access-hghvk\") pod \"openstack-operator-index-s6pgl\" (UID: \"b65b88ef-7a96-48cb-b629-252e7e269f17\") " pod="openstack-operators/openstack-operator-index-s6pgl" Dec 11 13:21:29 crc kubenswrapper[4898]: I1211 13:21:29.366295 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s6pgl" Dec 11 13:21:29 crc kubenswrapper[4898]: I1211 13:21:29.828060 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s6pgl"] Dec 11 13:21:29 crc kubenswrapper[4898]: W1211 13:21:29.834612 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb65b88ef_7a96_48cb_b629_252e7e269f17.slice/crio-34092af779068aacdff101d4f3517bbf7bd3ad051fe5c8d8c723738e56522137 WatchSource:0}: Error finding container 34092af779068aacdff101d4f3517bbf7bd3ad051fe5c8d8c723738e56522137: Status 404 returned error can't find the container with id 34092af779068aacdff101d4f3517bbf7bd3ad051fe5c8d8c723738e56522137 Dec 11 13:21:30 crc kubenswrapper[4898]: I1211 13:21:30.854798 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s6pgl" event={"ID":"b65b88ef-7a96-48cb-b629-252e7e269f17","Type":"ContainerStarted","Data":"34092af779068aacdff101d4f3517bbf7bd3ad051fe5c8d8c723738e56522137"} Dec 11 13:21:33 crc kubenswrapper[4898]: I1211 13:21:33.382625 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-s6pgl"] Dec 11 13:21:33 crc kubenswrapper[4898]: I1211 13:21:33.990220 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mxsdw"] Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.005317 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-operator-index-mxsdw"] Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.005407 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mxsdw" Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.093944 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p9k9\" (UniqueName: \"kubernetes.io/projected/2911f97f-4469-4335-b6be-48a0e3c6fda8-kube-api-access-9p9k9\") pod \"openstack-operator-index-mxsdw\" (UID: \"2911f97f-4469-4335-b6be-48a0e3c6fda8\") " pod="openstack-operators/openstack-operator-index-mxsdw" Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.195659 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p9k9\" (UniqueName: \"kubernetes.io/projected/2911f97f-4469-4335-b6be-48a0e3c6fda8-kube-api-access-9p9k9\") pod \"openstack-operator-index-mxsdw\" (UID: \"2911f97f-4469-4335-b6be-48a0e3c6fda8\") " pod="openstack-operators/openstack-operator-index-mxsdw" Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.241879 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p9k9\" (UniqueName: \"kubernetes.io/projected/2911f97f-4469-4335-b6be-48a0e3c6fda8-kube-api-access-9p9k9\") pod \"openstack-operator-index-mxsdw\" (UID: \"2911f97f-4469-4335-b6be-48a0e3c6fda8\") " pod="openstack-operators/openstack-operator-index-mxsdw" Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.350865 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mxsdw" Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.812606 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mxsdw"] Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.891937 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-s6pgl" podUID="b65b88ef-7a96-48cb-b629-252e7e269f17" containerName="registry-server" containerID="cri-o://c50eca7ecfcc46c6f86fbf9a24ccf1dbe63c9a24b087df6d4b33ed470b83b285" gracePeriod=2 Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.892075 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s6pgl" event={"ID":"b65b88ef-7a96-48cb-b629-252e7e269f17","Type":"ContainerStarted","Data":"c50eca7ecfcc46c6f86fbf9a24ccf1dbe63c9a24b087df6d4b33ed470b83b285"} Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.896117 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mxsdw" event={"ID":"2911f97f-4469-4335-b6be-48a0e3c6fda8","Type":"ContainerStarted","Data":"83b798a94a62ef9b26695cedad1c3ef96deac56b5cbcfeea22752bd4886edd33"} Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.904665 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-s6pgl" podStartSLOduration=2.481330017 podStartE2EDuration="6.904650634s" podCreationTimestamp="2025-12-11 13:21:28 +0000 UTC" firstStartedPulling="2025-12-11 13:21:29.837707128 +0000 UTC m=+1047.410033575" lastFinishedPulling="2025-12-11 13:21:34.261027755 +0000 UTC m=+1051.833354192" observedRunningTime="2025-12-11 13:21:34.904032687 +0000 UTC m=+1052.476359134" watchObservedRunningTime="2025-12-11 13:21:34.904650634 +0000 UTC m=+1052.476977081" Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.995695 4898 
patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.995795 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.995870 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.997112 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d75b5443399659dfc4c4753c4049d2fed5f950b8d251c55a9abe387518cd27d2"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 13:21:34 crc kubenswrapper[4898]: I1211 13:21:34.997270 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://d75b5443399659dfc4c4753c4049d2fed5f950b8d251c55a9abe387518cd27d2" gracePeriod=600 Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.414497 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-s6pgl" Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.530197 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hghvk\" (UniqueName: \"kubernetes.io/projected/b65b88ef-7a96-48cb-b629-252e7e269f17-kube-api-access-hghvk\") pod \"b65b88ef-7a96-48cb-b629-252e7e269f17\" (UID: \"b65b88ef-7a96-48cb-b629-252e7e269f17\") " Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.547243 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b65b88ef-7a96-48cb-b629-252e7e269f17-kube-api-access-hghvk" (OuterVolumeSpecName: "kube-api-access-hghvk") pod "b65b88ef-7a96-48cb-b629-252e7e269f17" (UID: "b65b88ef-7a96-48cb-b629-252e7e269f17"). InnerVolumeSpecName "kube-api-access-hghvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.632084 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hghvk\" (UniqueName: \"kubernetes.io/projected/b65b88ef-7a96-48cb-b629-252e7e269f17-kube-api-access-hghvk\") on node \"crc\" DevicePath \"\"" Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.907197 4898 generic.go:334] "Generic (PLEG): container finished" podID="b65b88ef-7a96-48cb-b629-252e7e269f17" containerID="c50eca7ecfcc46c6f86fbf9a24ccf1dbe63c9a24b087df6d4b33ed470b83b285" exitCode=0 Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.907274 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-s6pgl" Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.907328 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s6pgl" event={"ID":"b65b88ef-7a96-48cb-b629-252e7e269f17","Type":"ContainerDied","Data":"c50eca7ecfcc46c6f86fbf9a24ccf1dbe63c9a24b087df6d4b33ed470b83b285"} Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.907390 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s6pgl" event={"ID":"b65b88ef-7a96-48cb-b629-252e7e269f17","Type":"ContainerDied","Data":"34092af779068aacdff101d4f3517bbf7bd3ad051fe5c8d8c723738e56522137"} Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.907420 4898 scope.go:117] "RemoveContainer" containerID="c50eca7ecfcc46c6f86fbf9a24ccf1dbe63c9a24b087df6d4b33ed470b83b285" Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.910866 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="d75b5443399659dfc4c4753c4049d2fed5f950b8d251c55a9abe387518cd27d2" exitCode=0 Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.910919 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"d75b5443399659dfc4c4753c4049d2fed5f950b8d251c55a9abe387518cd27d2"} Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.910997 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"ca4a970ba19c45b9f6200362f9a5d9ef16d6404aa1395da5964e4d307dc4af2f"} Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.915088 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-mxsdw" event={"ID":"2911f97f-4469-4335-b6be-48a0e3c6fda8","Type":"ContainerStarted","Data":"ffef00265ee502022a28fca5ad7c1776d4abbea34919009e273517a092ac0fb9"} Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.931407 4898 scope.go:117] "RemoveContainer" containerID="c50eca7ecfcc46c6f86fbf9a24ccf1dbe63c9a24b087df6d4b33ed470b83b285" Dec 11 13:21:35 crc kubenswrapper[4898]: E1211 13:21:35.932247 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c50eca7ecfcc46c6f86fbf9a24ccf1dbe63c9a24b087df6d4b33ed470b83b285\": container with ID starting with c50eca7ecfcc46c6f86fbf9a24ccf1dbe63c9a24b087df6d4b33ed470b83b285 not found: ID does not exist" containerID="c50eca7ecfcc46c6f86fbf9a24ccf1dbe63c9a24b087df6d4b33ed470b83b285" Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.932305 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c50eca7ecfcc46c6f86fbf9a24ccf1dbe63c9a24b087df6d4b33ed470b83b285"} err="failed to get container status \"c50eca7ecfcc46c6f86fbf9a24ccf1dbe63c9a24b087df6d4b33ed470b83b285\": rpc error: code = NotFound desc = could not find container \"c50eca7ecfcc46c6f86fbf9a24ccf1dbe63c9a24b087df6d4b33ed470b83b285\": container with ID starting with c50eca7ecfcc46c6f86fbf9a24ccf1dbe63c9a24b087df6d4b33ed470b83b285 not found: ID does not exist" Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.932340 4898 scope.go:117] "RemoveContainer" containerID="0d92a34a49f382a6447faa4993a93c91c6e3595696f153fe710d6ffe0ba35fec" Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.944531 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mxsdw" podStartSLOduration=2.874181515 podStartE2EDuration="2.944514156s" podCreationTimestamp="2025-12-11 13:21:33 +0000 UTC" firstStartedPulling="2025-12-11 13:21:34.819768762 +0000 UTC 
m=+1052.392095199" lastFinishedPulling="2025-12-11 13:21:34.890101403 +0000 UTC m=+1052.462427840" observedRunningTime="2025-12-11 13:21:35.943356494 +0000 UTC m=+1053.515682961" watchObservedRunningTime="2025-12-11 13:21:35.944514156 +0000 UTC m=+1053.516840583" Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.989498 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-s6pgl"] Dec 11 13:21:35 crc kubenswrapper[4898]: I1211 13:21:35.996388 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-s6pgl"] Dec 11 13:21:36 crc kubenswrapper[4898]: E1211 13:21:36.079128 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb65b88ef_7a96_48cb_b629_252e7e269f17.slice\": RecentStats: unable to find data in memory cache]" Dec 11 13:21:36 crc kubenswrapper[4898]: I1211 13:21:36.788833 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b65b88ef-7a96-48cb-b629-252e7e269f17" path="/var/lib/kubelet/pods/b65b88ef-7a96-48cb-b629-252e7e269f17/volumes" Dec 11 13:21:44 crc kubenswrapper[4898]: I1211 13:21:44.351534 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-mxsdw" Dec 11 13:21:44 crc kubenswrapper[4898]: I1211 13:21:44.352304 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-mxsdw" Dec 11 13:21:44 crc kubenswrapper[4898]: I1211 13:21:44.396593 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-mxsdw" Dec 11 13:21:45 crc kubenswrapper[4898]: I1211 13:21:45.063194 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-mxsdw" Dec 11 13:21:52 crc 
kubenswrapper[4898]: I1211 13:21:52.023056 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz"] Dec 11 13:21:52 crc kubenswrapper[4898]: E1211 13:21:52.023902 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65b88ef-7a96-48cb-b629-252e7e269f17" containerName="registry-server" Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.023934 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65b88ef-7a96-48cb-b629-252e7e269f17" containerName="registry-server" Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.024137 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b65b88ef-7a96-48cb-b629-252e7e269f17" containerName="registry-server" Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.026614 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.029371 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz"] Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.031446 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6mz5t" Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.061916 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt9sx\" (UniqueName: \"kubernetes.io/projected/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-kube-api-access-rt9sx\") pod \"46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz\" (UID: \"4c60c53a-7f88-4da3-8ede-b6f1c3e32564\") " pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.062070 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-util\") pod \"46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz\" (UID: \"4c60c53a-7f88-4da3-8ede-b6f1c3e32564\") " pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.062151 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-bundle\") pod \"46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz\" (UID: \"4c60c53a-7f88-4da3-8ede-b6f1c3e32564\") " pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.163744 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt9sx\" (UniqueName: \"kubernetes.io/projected/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-kube-api-access-rt9sx\") pod \"46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz\" (UID: \"4c60c53a-7f88-4da3-8ede-b6f1c3e32564\") " pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.165641 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-util\") pod \"46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz\" (UID: \"4c60c53a-7f88-4da3-8ede-b6f1c3e32564\") " pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.165765 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-bundle\") pod \"46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz\" (UID: \"4c60c53a-7f88-4da3-8ede-b6f1c3e32564\") " pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.166515 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-bundle\") pod \"46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz\" (UID: \"4c60c53a-7f88-4da3-8ede-b6f1c3e32564\") " pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.166927 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-util\") pod \"46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz\" (UID: \"4c60c53a-7f88-4da3-8ede-b6f1c3e32564\") " pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.182853 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt9sx\" (UniqueName: \"kubernetes.io/projected/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-kube-api-access-rt9sx\") pod \"46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz\" (UID: \"4c60c53a-7f88-4da3-8ede-b6f1c3e32564\") " pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.351395 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" Dec 11 13:21:52 crc kubenswrapper[4898]: I1211 13:21:52.810603 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz"] Dec 11 13:21:52 crc kubenswrapper[4898]: W1211 13:21:52.820006 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c60c53a_7f88_4da3_8ede_b6f1c3e32564.slice/crio-0cfdda319e544a3031fda6cbc8b7e37d1699c42b261f681d1fba55829f8908a6 WatchSource:0}: Error finding container 0cfdda319e544a3031fda6cbc8b7e37d1699c42b261f681d1fba55829f8908a6: Status 404 returned error can't find the container with id 0cfdda319e544a3031fda6cbc8b7e37d1699c42b261f681d1fba55829f8908a6 Dec 11 13:21:53 crc kubenswrapper[4898]: I1211 13:21:53.134305 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" event={"ID":"4c60c53a-7f88-4da3-8ede-b6f1c3e32564","Type":"ContainerStarted","Data":"0cfdda319e544a3031fda6cbc8b7e37d1699c42b261f681d1fba55829f8908a6"} Dec 11 13:21:54 crc kubenswrapper[4898]: I1211 13:21:54.147248 4898 generic.go:334] "Generic (PLEG): container finished" podID="4c60c53a-7f88-4da3-8ede-b6f1c3e32564" containerID="1a13c2f3cc20b57ad5184ccda0dbee1f233501506efe5dd766a83cc617d6fd5f" exitCode=0 Dec 11 13:21:54 crc kubenswrapper[4898]: I1211 13:21:54.147318 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" event={"ID":"4c60c53a-7f88-4da3-8ede-b6f1c3e32564","Type":"ContainerDied","Data":"1a13c2f3cc20b57ad5184ccda0dbee1f233501506efe5dd766a83cc617d6fd5f"} Dec 11 13:21:55 crc kubenswrapper[4898]: I1211 13:21:55.159537 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="4c60c53a-7f88-4da3-8ede-b6f1c3e32564" containerID="a9268005442b4a3826244c11a5d2a27e9eac57a9d2c67e414be50583730c493f" exitCode=0 Dec 11 13:21:55 crc kubenswrapper[4898]: I1211 13:21:55.159686 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" event={"ID":"4c60c53a-7f88-4da3-8ede-b6f1c3e32564","Type":"ContainerDied","Data":"a9268005442b4a3826244c11a5d2a27e9eac57a9d2c67e414be50583730c493f"} Dec 11 13:21:56 crc kubenswrapper[4898]: I1211 13:21:56.176104 4898 generic.go:334] "Generic (PLEG): container finished" podID="4c60c53a-7f88-4da3-8ede-b6f1c3e32564" containerID="309c255b5b0e7f9a3bb7e514e9acdba555ccc1418959384525da887768199bb2" exitCode=0 Dec 11 13:21:56 crc kubenswrapper[4898]: I1211 13:21:56.176313 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" event={"ID":"4c60c53a-7f88-4da3-8ede-b6f1c3e32564","Type":"ContainerDied","Data":"309c255b5b0e7f9a3bb7e514e9acdba555ccc1418959384525da887768199bb2"} Dec 11 13:21:57 crc kubenswrapper[4898]: I1211 13:21:57.592033 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" Dec 11 13:21:57 crc kubenswrapper[4898]: I1211 13:21:57.772964 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-bundle\") pod \"4c60c53a-7f88-4da3-8ede-b6f1c3e32564\" (UID: \"4c60c53a-7f88-4da3-8ede-b6f1c3e32564\") " Dec 11 13:21:57 crc kubenswrapper[4898]: I1211 13:21:57.773031 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-util\") pod \"4c60c53a-7f88-4da3-8ede-b6f1c3e32564\" (UID: \"4c60c53a-7f88-4da3-8ede-b6f1c3e32564\") " Dec 11 13:21:57 crc kubenswrapper[4898]: I1211 13:21:57.773072 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt9sx\" (UniqueName: \"kubernetes.io/projected/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-kube-api-access-rt9sx\") pod \"4c60c53a-7f88-4da3-8ede-b6f1c3e32564\" (UID: \"4c60c53a-7f88-4da3-8ede-b6f1c3e32564\") " Dec 11 13:21:57 crc kubenswrapper[4898]: I1211 13:21:57.773965 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-bundle" (OuterVolumeSpecName: "bundle") pod "4c60c53a-7f88-4da3-8ede-b6f1c3e32564" (UID: "4c60c53a-7f88-4da3-8ede-b6f1c3e32564"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:21:57 crc kubenswrapper[4898]: I1211 13:21:57.780770 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-kube-api-access-rt9sx" (OuterVolumeSpecName: "kube-api-access-rt9sx") pod "4c60c53a-7f88-4da3-8ede-b6f1c3e32564" (UID: "4c60c53a-7f88-4da3-8ede-b6f1c3e32564"). InnerVolumeSpecName "kube-api-access-rt9sx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:21:57 crc kubenswrapper[4898]: I1211 13:21:57.788566 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-util" (OuterVolumeSpecName: "util") pod "4c60c53a-7f88-4da3-8ede-b6f1c3e32564" (UID: "4c60c53a-7f88-4da3-8ede-b6f1c3e32564"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:21:57 crc kubenswrapper[4898]: I1211 13:21:57.875686 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-util\") on node \"crc\" DevicePath \"\"" Dec 11 13:21:57 crc kubenswrapper[4898]: I1211 13:21:57.875714 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt9sx\" (UniqueName: \"kubernetes.io/projected/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-kube-api-access-rt9sx\") on node \"crc\" DevicePath \"\"" Dec 11 13:21:57 crc kubenswrapper[4898]: I1211 13:21:57.875725 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c60c53a-7f88-4da3-8ede-b6f1c3e32564-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:21:58 crc kubenswrapper[4898]: I1211 13:21:58.198946 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" event={"ID":"4c60c53a-7f88-4da3-8ede-b6f1c3e32564","Type":"ContainerDied","Data":"0cfdda319e544a3031fda6cbc8b7e37d1699c42b261f681d1fba55829f8908a6"} Dec 11 13:21:58 crc kubenswrapper[4898]: I1211 13:21:58.199027 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cfdda319e544a3031fda6cbc8b7e37d1699c42b261f681d1fba55829f8908a6" Dec 11 13:21:58 crc kubenswrapper[4898]: I1211 13:21:58.199124 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz" Dec 11 13:22:04 crc kubenswrapper[4898]: I1211 13:22:04.621241 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5dc777b99d-mszqj"] Dec 11 13:22:04 crc kubenswrapper[4898]: E1211 13:22:04.622296 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c60c53a-7f88-4da3-8ede-b6f1c3e32564" containerName="util" Dec 11 13:22:04 crc kubenswrapper[4898]: I1211 13:22:04.622316 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c60c53a-7f88-4da3-8ede-b6f1c3e32564" containerName="util" Dec 11 13:22:04 crc kubenswrapper[4898]: E1211 13:22:04.622359 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c60c53a-7f88-4da3-8ede-b6f1c3e32564" containerName="extract" Dec 11 13:22:04 crc kubenswrapper[4898]: I1211 13:22:04.622371 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c60c53a-7f88-4da3-8ede-b6f1c3e32564" containerName="extract" Dec 11 13:22:04 crc kubenswrapper[4898]: E1211 13:22:04.622413 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c60c53a-7f88-4da3-8ede-b6f1c3e32564" containerName="pull" Dec 11 13:22:04 crc kubenswrapper[4898]: I1211 13:22:04.622429 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c60c53a-7f88-4da3-8ede-b6f1c3e32564" containerName="pull" Dec 11 13:22:04 crc kubenswrapper[4898]: I1211 13:22:04.622780 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c60c53a-7f88-4da3-8ede-b6f1c3e32564" containerName="extract" Dec 11 13:22:04 crc kubenswrapper[4898]: I1211 13:22:04.626994 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5dc777b99d-mszqj" Dec 11 13:22:04 crc kubenswrapper[4898]: I1211 13:22:04.630329 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-kwlgk" Dec 11 13:22:04 crc kubenswrapper[4898]: I1211 13:22:04.655998 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5dc777b99d-mszqj"] Dec 11 13:22:04 crc kubenswrapper[4898]: I1211 13:22:04.703180 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4qb7\" (UniqueName: \"kubernetes.io/projected/4e56a824-5c00-4a67-a8c3-a32a001f0ce4-kube-api-access-p4qb7\") pod \"openstack-operator-controller-operator-5dc777b99d-mszqj\" (UID: \"4e56a824-5c00-4a67-a8c3-a32a001f0ce4\") " pod="openstack-operators/openstack-operator-controller-operator-5dc777b99d-mszqj" Dec 11 13:22:04 crc kubenswrapper[4898]: I1211 13:22:04.805502 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4qb7\" (UniqueName: \"kubernetes.io/projected/4e56a824-5c00-4a67-a8c3-a32a001f0ce4-kube-api-access-p4qb7\") pod \"openstack-operator-controller-operator-5dc777b99d-mszqj\" (UID: \"4e56a824-5c00-4a67-a8c3-a32a001f0ce4\") " pod="openstack-operators/openstack-operator-controller-operator-5dc777b99d-mszqj" Dec 11 13:22:04 crc kubenswrapper[4898]: I1211 13:22:04.836514 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4qb7\" (UniqueName: \"kubernetes.io/projected/4e56a824-5c00-4a67-a8c3-a32a001f0ce4-kube-api-access-p4qb7\") pod \"openstack-operator-controller-operator-5dc777b99d-mszqj\" (UID: \"4e56a824-5c00-4a67-a8c3-a32a001f0ce4\") " pod="openstack-operators/openstack-operator-controller-operator-5dc777b99d-mszqj" Dec 11 13:22:04 crc kubenswrapper[4898]: I1211 13:22:04.945736 4898 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5dc777b99d-mszqj" Dec 11 13:22:05 crc kubenswrapper[4898]: I1211 13:22:05.483789 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5dc777b99d-mszqj"] Dec 11 13:22:05 crc kubenswrapper[4898]: W1211 13:22:05.496832 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e56a824_5c00_4a67_a8c3_a32a001f0ce4.slice/crio-3396a1637be8737f8f78aca1c94e86b06490728b62d99c625497ca271ea1b16c WatchSource:0}: Error finding container 3396a1637be8737f8f78aca1c94e86b06490728b62d99c625497ca271ea1b16c: Status 404 returned error can't find the container with id 3396a1637be8737f8f78aca1c94e86b06490728b62d99c625497ca271ea1b16c Dec 11 13:22:06 crc kubenswrapper[4898]: I1211 13:22:06.292876 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5dc777b99d-mszqj" event={"ID":"4e56a824-5c00-4a67-a8c3-a32a001f0ce4","Type":"ContainerStarted","Data":"3396a1637be8737f8f78aca1c94e86b06490728b62d99c625497ca271ea1b16c"} Dec 11 13:22:10 crc kubenswrapper[4898]: I1211 13:22:10.339743 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5dc777b99d-mszqj" event={"ID":"4e56a824-5c00-4a67-a8c3-a32a001f0ce4","Type":"ContainerStarted","Data":"7ccf10f8c88ea96fd218d94e4778584a7af8706ef1c9d835c0a0d0804d9f42d7"} Dec 11 13:22:10 crc kubenswrapper[4898]: I1211 13:22:10.340411 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5dc777b99d-mszqj" Dec 11 13:22:10 crc kubenswrapper[4898]: I1211 13:22:10.381440 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-operator-5dc777b99d-mszqj" podStartSLOduration=2.107540174 podStartE2EDuration="6.381416257s" podCreationTimestamp="2025-12-11 13:22:04 +0000 UTC" firstStartedPulling="2025-12-11 13:22:05.501350738 +0000 UTC m=+1083.073677185" lastFinishedPulling="2025-12-11 13:22:09.775226831 +0000 UTC m=+1087.347553268" observedRunningTime="2025-12-11 13:22:10.376937153 +0000 UTC m=+1087.949263600" watchObservedRunningTime="2025-12-11 13:22:10.381416257 +0000 UTC m=+1087.953742724" Dec 11 13:22:14 crc kubenswrapper[4898]: I1211 13:22:14.949766 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5dc777b99d-mszqj" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.283241 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.285058 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.287093 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-bqbfh" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.290827 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.292315 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.294008 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-wdqfl" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.301185 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.311370 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.352228 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.354094 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.357051 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-vpmhg" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.361287 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.362993 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.365759 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-c8rcm" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.370551 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.381039 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4lcmf"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.383313 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4lcmf" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.384397 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htdgw\" (UniqueName: \"kubernetes.io/projected/d7d8e047-7525-4d88-b802-550590e7f743-kube-api-access-htdgw\") pod \"designate-operator-controller-manager-697fb699cf-pdvzc\" (UID: \"d7d8e047-7525-4d88-b802-550590e7f743\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.384448 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bflkc\" (UniqueName: \"kubernetes.io/projected/5c391a19-7c2d-4838-9269-2c5cd8eea1ad-kube-api-access-bflkc\") pod \"glance-operator-controller-manager-5697bb5779-7kffw\" (UID: \"5c391a19-7c2d-4838-9269-2c5cd8eea1ad\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.384498 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5km74\" (UniqueName: \"kubernetes.io/projected/c154e39f-1760-4071-b688-f301c3a398e7-kube-api-access-5km74\") pod \"cinder-operator-controller-manager-6c677c69b-rfxp7\" (UID: \"c154e39f-1760-4071-b688-f301c3a398e7\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.384525 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jbfs\" (UniqueName: \"kubernetes.io/projected/c99ec3c0-d415-4322-95cd-d57411a1db7b-kube-api-access-4jbfs\") pod \"barbican-operator-controller-manager-7d9dfd778-hmbfx\" (UID: \"c99ec3c0-d415-4322-95cd-d57411a1db7b\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.392986 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-nvq57" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.405271 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.495190 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htdgw\" (UniqueName: \"kubernetes.io/projected/d7d8e047-7525-4d88-b802-550590e7f743-kube-api-access-htdgw\") pod \"designate-operator-controller-manager-697fb699cf-pdvzc\" (UID: \"d7d8e047-7525-4d88-b802-550590e7f743\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.495243 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bflkc\" (UniqueName: \"kubernetes.io/projected/5c391a19-7c2d-4838-9269-2c5cd8eea1ad-kube-api-access-bflkc\") 
pod \"glance-operator-controller-manager-5697bb5779-7kffw\" (UID: \"5c391a19-7c2d-4838-9269-2c5cd8eea1ad\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.495276 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5km74\" (UniqueName: \"kubernetes.io/projected/c154e39f-1760-4071-b688-f301c3a398e7-kube-api-access-5km74\") pod \"cinder-operator-controller-manager-6c677c69b-rfxp7\" (UID: \"c154e39f-1760-4071-b688-f301c3a398e7\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.495301 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jbfs\" (UniqueName: \"kubernetes.io/projected/c99ec3c0-d415-4322-95cd-d57411a1db7b-kube-api-access-4jbfs\") pod \"barbican-operator-controller-manager-7d9dfd778-hmbfx\" (UID: \"c99ec3c0-d415-4322-95cd-d57411a1db7b\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.505143 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4lcmf"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.520497 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.521890 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.525052 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-x8lzj" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.536813 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.544820 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.547859 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9h4wz" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.548930 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.549908 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bflkc\" (UniqueName: \"kubernetes.io/projected/5c391a19-7c2d-4838-9269-2c5cd8eea1ad-kube-api-access-bflkc\") pod \"glance-operator-controller-manager-5697bb5779-7kffw\" (UID: \"5c391a19-7c2d-4838-9269-2c5cd8eea1ad\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.552993 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htdgw\" (UniqueName: \"kubernetes.io/projected/d7d8e047-7525-4d88-b802-550590e7f743-kube-api-access-htdgw\") pod \"designate-operator-controller-manager-697fb699cf-pdvzc\" (UID: \"d7d8e047-7525-4d88-b802-550590e7f743\") " 
pod="openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.554942 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5km74\" (UniqueName: \"kubernetes.io/projected/c154e39f-1760-4071-b688-f301c3a398e7-kube-api-access-5km74\") pod \"cinder-operator-controller-manager-6c677c69b-rfxp7\" (UID: \"c154e39f-1760-4071-b688-f301c3a398e7\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.559913 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.562219 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jbfs\" (UniqueName: \"kubernetes.io/projected/c99ec3c0-d415-4322-95cd-d57411a1db7b-kube-api-access-4jbfs\") pod \"barbican-operator-controller-manager-7d9dfd778-hmbfx\" (UID: \"c99ec3c0-d415-4322-95cd-d57411a1db7b\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.596511 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvsd6\" (UniqueName: \"kubernetes.io/projected/7a0813f7-7167-46ed-b9f8-e2157e92f620-kube-api-access-tvsd6\") pod \"heat-operator-controller-manager-5f64f6f8bb-4lcmf\" (UID: \"7a0813f7-7167-46ed-b9f8-e2157e92f620\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4lcmf" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.605816 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.634999 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.639670 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.654871 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-qrfjf"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.656183 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-qrfjf" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.659889 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-gd89p" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.667478 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-qrfjf"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.689756 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.695542 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.696784 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.698806 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgkq8\" (UniqueName: \"kubernetes.io/projected/e2834985-dbd0-4ad6-afd2-8238997ec8e5-kube-api-access-xgkq8\") pod \"infra-operator-controller-manager-78d48bff9d-8nj46\" (UID: \"e2834985-dbd0-4ad6-afd2-8238997ec8e5\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.698856 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvsd6\" (UniqueName: \"kubernetes.io/projected/7a0813f7-7167-46ed-b9f8-e2157e92f620-kube-api-access-tvsd6\") pod \"heat-operator-controller-manager-5f64f6f8bb-4lcmf\" (UID: \"7a0813f7-7167-46ed-b9f8-e2157e92f620\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4lcmf" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.698908 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert\") pod \"infra-operator-controller-manager-78d48bff9d-8nj46\" (UID: \"e2834985-dbd0-4ad6-afd2-8238997ec8e5\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.699091 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mphhz\" (UniqueName: \"kubernetes.io/projected/0c6054e7-bb0a-4cbd-b459-d9d100182fa1-kube-api-access-mphhz\") pod \"horizon-operator-controller-manager-68c6d99b8f-tq8mw\" (UID: \"0c6054e7-bb0a-4cbd-b459-d9d100182fa1\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw" Dec 11 13:22:51 crc kubenswrapper[4898]: 
I1211 13:22:51.701534 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-p55ff" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.705305 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.713914 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-qpznh"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.717839 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-qpznh" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.721686 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-26qs6" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.724689 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.727011 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.731119 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-vcxqj" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.732130 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvsd6\" (UniqueName: \"kubernetes.io/projected/7a0813f7-7167-46ed-b9f8-e2157e92f620-kube-api-access-tvsd6\") pod \"heat-operator-controller-manager-5f64f6f8bb-4lcmf\" (UID: \"7a0813f7-7167-46ed-b9f8-e2157e92f620\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4lcmf" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.735386 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.749561 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pgfnv"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.750730 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pgfnv" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.754175 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-8wglh" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.757923 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.759190 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.768561 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-67xjf" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.773777 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-qpznh"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.785486 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.792796 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4lcmf" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.797225 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-dvtzm"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.798590 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dvtzm" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.800382 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgkq8\" (UniqueName: \"kubernetes.io/projected/e2834985-dbd0-4ad6-afd2-8238997ec8e5-kube-api-access-xgkq8\") pod \"infra-operator-controller-manager-78d48bff9d-8nj46\" (UID: \"e2834985-dbd0-4ad6-afd2-8238997ec8e5\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.800407 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert\") pod \"infra-operator-controller-manager-78d48bff9d-8nj46\" (UID: \"e2834985-dbd0-4ad6-afd2-8238997ec8e5\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.800432 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5f2h\" (UniqueName: \"kubernetes.io/projected/e87a760e-bf60-4a98-bb37-1f44745e250f-kube-api-access-d5f2h\") pod \"keystone-operator-controller-manager-7765d96ddf-wlqm4\" (UID: \"e87a760e-bf60-4a98-bb37-1f44745e250f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.801570 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-2b2v2" Dec 11 13:22:51 crc kubenswrapper[4898]: E1211 13:22:51.802568 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 11 13:22:51 crc kubenswrapper[4898]: E1211 13:22:51.802648 4898 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert podName:e2834985-dbd0-4ad6-afd2-8238997ec8e5 nodeName:}" failed. No retries permitted until 2025-12-11 13:22:52.302606347 +0000 UTC m=+1129.874932794 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert") pod "infra-operator-controller-manager-78d48bff9d-8nj46" (UID: "e2834985-dbd0-4ad6-afd2-8238997ec8e5") : secret "infra-operator-webhook-server-cert" not found Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.803371 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mphhz\" (UniqueName: \"kubernetes.io/projected/0c6054e7-bb0a-4cbd-b459-d9d100182fa1-kube-api-access-mphhz\") pod \"horizon-operator-controller-manager-68c6d99b8f-tq8mw\" (UID: \"0c6054e7-bb0a-4cbd-b459-d9d100182fa1\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.803608 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcvf9\" (UniqueName: \"kubernetes.io/projected/1f2676f6-97b8-425e-9d05-9ec2c52055de-kube-api-access-jcvf9\") pod \"ironic-operator-controller-manager-967d97867-qrfjf\" (UID: \"1f2676f6-97b8-425e-9d05-9ec2c52055de\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-qrfjf" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.816329 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pgfnv"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.824214 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mphhz\" (UniqueName: \"kubernetes.io/projected/0c6054e7-bb0a-4cbd-b459-d9d100182fa1-kube-api-access-mphhz\") pod 
\"horizon-operator-controller-manager-68c6d99b8f-tq8mw\" (UID: \"0c6054e7-bb0a-4cbd-b459-d9d100182fa1\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.827763 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.835244 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgkq8\" (UniqueName: \"kubernetes.io/projected/e2834985-dbd0-4ad6-afd2-8238997ec8e5-kube-api-access-xgkq8\") pod \"infra-operator-controller-manager-78d48bff9d-8nj46\" (UID: \"e2834985-dbd0-4ad6-afd2-8238997ec8e5\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.835807 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-dvtzm"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.853372 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.854756 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.857428 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-4dw8m" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.857633 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.875824 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-nx4h5"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.877137 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nx4h5" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.885573 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-bfpcm" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.905244 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lmvl\" (UniqueName: \"kubernetes.io/projected/1e77353c-6728-4dfa-814c-1a92115c8bf2-kube-api-access-5lmvl\") pod \"mariadb-operator-controller-manager-79c8c4686c-qpznh\" (UID: \"1e77353c-6728-4dfa-814c-1a92115c8bf2\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-qpznh" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.905340 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5k4r\" (UniqueName: \"kubernetes.io/projected/09d9c781-008c-4486-807c-159f4fefe857-kube-api-access-t5k4r\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-pgfnv\" (UID: 
\"09d9c781-008c-4486-807c-159f4fefe857\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pgfnv" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.905406 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5f2h\" (UniqueName: \"kubernetes.io/projected/e87a760e-bf60-4a98-bb37-1f44745e250f-kube-api-access-d5f2h\") pod \"keystone-operator-controller-manager-7765d96ddf-wlqm4\" (UID: \"e87a760e-bf60-4a98-bb37-1f44745e250f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.905505 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-772lz\" (UniqueName: \"kubernetes.io/projected/88e97f63-c1cb-4ef1-9d95-0c11dc52c94c-kube-api-access-772lz\") pod \"octavia-operator-controller-manager-998648c74-dvtzm\" (UID: \"88e97f63-c1cb-4ef1-9d95-0c11dc52c94c\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-dvtzm" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.905581 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pglc4\" (UniqueName: \"kubernetes.io/projected/ac70ed50-7e53-4bb9-ac63-35e5c0651db5-kube-api-access-pglc4\") pod \"nova-operator-controller-manager-697bc559fc-pqc9z\" (UID: \"ac70ed50-7e53-4bb9-ac63-35e5c0651db5\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.905635 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdd8k\" (UniqueName: \"kubernetes.io/projected/a99a2194-b89b-4a6a-a086-acd20b489632-kube-api-access-jdd8k\") pod \"manila-operator-controller-manager-5b5fd79c9c-wxz25\" (UID: \"a99a2194-b89b-4a6a-a086-acd20b489632\") " 
pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.905724 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcvf9\" (UniqueName: \"kubernetes.io/projected/1f2676f6-97b8-425e-9d05-9ec2c52055de-kube-api-access-jcvf9\") pod \"ironic-operator-controller-manager-967d97867-qrfjf\" (UID: \"1f2676f6-97b8-425e-9d05-9ec2c52055de\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-qrfjf" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.928965 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5f2h\" (UniqueName: \"kubernetes.io/projected/e87a760e-bf60-4a98-bb37-1f44745e250f-kube-api-access-d5f2h\") pod \"keystone-operator-controller-manager-7765d96ddf-wlqm4\" (UID: \"e87a760e-bf60-4a98-bb37-1f44745e250f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.931903 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.936556 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcvf9\" (UniqueName: \"kubernetes.io/projected/1f2676f6-97b8-425e-9d05-9ec2c52055de-kube-api-access-jcvf9\") pod \"ironic-operator-controller-manager-967d97867-qrfjf\" (UID: \"1f2676f6-97b8-425e-9d05-9ec2c52055de\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-qrfjf" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.937856 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.964421 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.973311 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.978838 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-nx4h5"] Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.979819 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-6qndw" Dec 11 13:22:51 crc kubenswrapper[4898]: I1211 13:22:51.984553 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.002494 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.004354 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.009283 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-ncp4s" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.009590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdd8k\" (UniqueName: \"kubernetes.io/projected/a99a2194-b89b-4a6a-a086-acd20b489632-kube-api-access-jdd8k\") pod \"manila-operator-controller-manager-5b5fd79c9c-wxz25\" (UID: \"a99a2194-b89b-4a6a-a086-acd20b489632\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.009674 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stl6q\" (UniqueName: \"kubernetes.io/projected/95c66498-ab0d-4618-b884-523e1183d758-kube-api-access-stl6q\") pod \"ovn-operator-controller-manager-b6456fdb6-nx4h5\" (UID: \"95c66498-ab0d-4618-b884-523e1183d758\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nx4h5" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.009717 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lmvl\" (UniqueName: \"kubernetes.io/projected/1e77353c-6728-4dfa-814c-1a92115c8bf2-kube-api-access-5lmvl\") pod \"mariadb-operator-controller-manager-79c8c4686c-qpznh\" (UID: \"1e77353c-6728-4dfa-814c-1a92115c8bf2\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-qpznh" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.009737 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5k4r\" (UniqueName: \"kubernetes.io/projected/09d9c781-008c-4486-807c-159f4fefe857-kube-api-access-t5k4r\") pod 
\"neutron-operator-controller-manager-5fdfd5b6b5-pgfnv\" (UID: \"09d9c781-008c-4486-807c-159f4fefe857\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pgfnv" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.009777 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fv8ph7\" (UID: \"1fbc642b-9636-47c2-a3db-7913fa4a6b91\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.009808 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-964rh\" (UniqueName: \"kubernetes.io/projected/1fbc642b-9636-47c2-a3db-7913fa4a6b91-kube-api-access-964rh\") pod \"openstack-baremetal-operator-controller-manager-84b575879fv8ph7\" (UID: \"1fbc642b-9636-47c2-a3db-7913fa4a6b91\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.009830 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-772lz\" (UniqueName: \"kubernetes.io/projected/88e97f63-c1cb-4ef1-9d95-0c11dc52c94c-kube-api-access-772lz\") pod \"octavia-operator-controller-manager-998648c74-dvtzm\" (UID: \"88e97f63-c1cb-4ef1-9d95-0c11dc52c94c\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-dvtzm" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.009869 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pglc4\" (UniqueName: \"kubernetes.io/projected/ac70ed50-7e53-4bb9-ac63-35e5c0651db5-kube-api-access-pglc4\") pod \"nova-operator-controller-manager-697bc559fc-pqc9z\" (UID: \"ac70ed50-7e53-4bb9-ac63-35e5c0651db5\") " 
pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.014420 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.048529 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5k4r\" (UniqueName: \"kubernetes.io/projected/09d9c781-008c-4486-807c-159f4fefe857-kube-api-access-t5k4r\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-pgfnv\" (UID: \"09d9c781-008c-4486-807c-159f4fefe857\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pgfnv" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.048858 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdd8k\" (UniqueName: \"kubernetes.io/projected/a99a2194-b89b-4a6a-a086-acd20b489632-kube-api-access-jdd8k\") pod \"manila-operator-controller-manager-5b5fd79c9c-wxz25\" (UID: \"a99a2194-b89b-4a6a-a086-acd20b489632\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.049689 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pglc4\" (UniqueName: \"kubernetes.io/projected/ac70ed50-7e53-4bb9-ac63-35e5c0651db5-kube-api-access-pglc4\") pod \"nova-operator-controller-manager-697bc559fc-pqc9z\" (UID: \"ac70ed50-7e53-4bb9-ac63-35e5c0651db5\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.051299 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-772lz\" (UniqueName: \"kubernetes.io/projected/88e97f63-c1cb-4ef1-9d95-0c11dc52c94c-kube-api-access-772lz\") pod \"octavia-operator-controller-manager-998648c74-dvtzm\" (UID: \"88e97f63-c1cb-4ef1-9d95-0c11dc52c94c\") " 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-dvtzm" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.069220 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-qrfjf" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.075784 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lmvl\" (UniqueName: \"kubernetes.io/projected/1e77353c-6728-4dfa-814c-1a92115c8bf2-kube-api-access-5lmvl\") pod \"mariadb-operator-controller-manager-79c8c4686c-qpznh\" (UID: \"1e77353c-6728-4dfa-814c-1a92115c8bf2\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-qpznh" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.090741 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.113283 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fv8ph7\" (UID: \"1fbc642b-9636-47c2-a3db-7913fa4a6b91\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.113390 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-964rh\" (UniqueName: \"kubernetes.io/projected/1fbc642b-9636-47c2-a3db-7913fa4a6b91-kube-api-access-964rh\") pod \"openstack-baremetal-operator-controller-manager-84b575879fv8ph7\" (UID: \"1fbc642b-9636-47c2-a3db-7913fa4a6b91\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.113523 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfg7c\" (UniqueName: \"kubernetes.io/projected/42b8c71f-abd8-49b1-b604-49b3292ba29a-kube-api-access-rfg7c\") pod \"placement-operator-controller-manager-78f8948974-9p4w9\" (UID: \"42b8c71f-abd8-49b1-b604-49b3292ba29a\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.113707 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stl6q\" (UniqueName: \"kubernetes.io/projected/95c66498-ab0d-4618-b884-523e1183d758-kube-api-access-stl6q\") pod \"ovn-operator-controller-manager-b6456fdb6-nx4h5\" (UID: \"95c66498-ab0d-4618-b884-523e1183d758\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nx4h5" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.113773 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkpbx\" (UniqueName: \"kubernetes.io/projected/cb6adf46-208a-4945-97aa-2c457b9c2614-kube-api-access-lkpbx\") pod \"swift-operator-controller-manager-9d58d64bc-2wt95\" (UID: \"cb6adf46-208a-4945-97aa-2c457b9c2614\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95" Dec 11 13:22:52 crc kubenswrapper[4898]: E1211 13:22:52.114002 4898 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 13:22:52 crc kubenswrapper[4898]: E1211 13:22:52.114067 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert podName:1fbc642b-9636-47c2-a3db-7913fa4a6b91 nodeName:}" failed. No retries permitted until 2025-12-11 13:22:52.614040592 +0000 UTC m=+1130.186367039 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fv8ph7" (UID: "1fbc642b-9636-47c2-a3db-7913fa4a6b91") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.124422 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.128684 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.129616 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-qpznh" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.131699 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.132242 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-scnn5" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.135832 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-964rh\" (UniqueName: \"kubernetes.io/projected/1fbc642b-9636-47c2-a3db-7913fa4a6b91-kube-api-access-964rh\") pod \"openstack-baremetal-operator-controller-manager-84b575879fv8ph7\" (UID: \"1fbc642b-9636-47c2-a3db-7913fa4a6b91\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.143191 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.163022 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stl6q\" (UniqueName: \"kubernetes.io/projected/95c66498-ab0d-4618-b884-523e1183d758-kube-api-access-stl6q\") pod \"ovn-operator-controller-manager-b6456fdb6-nx4h5\" (UID: \"95c66498-ab0d-4618-b884-523e1183d758\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nx4h5" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.177860 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.182041 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.185887 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-rz7kb" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.215691 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkpbx\" (UniqueName: \"kubernetes.io/projected/cb6adf46-208a-4945-97aa-2c457b9c2614-kube-api-access-lkpbx\") pod \"swift-operator-controller-manager-9d58d64bc-2wt95\" (UID: \"cb6adf46-208a-4945-97aa-2c457b9c2614\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.215798 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfg7c\" (UniqueName: \"kubernetes.io/projected/42b8c71f-abd8-49b1-b604-49b3292ba29a-kube-api-access-rfg7c\") pod \"placement-operator-controller-manager-78f8948974-9p4w9\" (UID: \"42b8c71f-abd8-49b1-b604-49b3292ba29a\") " 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.216788 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.235468 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pgfnv" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.243439 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkpbx\" (UniqueName: \"kubernetes.io/projected/cb6adf46-208a-4945-97aa-2c457b9c2614-kube-api-access-lkpbx\") pod \"swift-operator-controller-manager-9d58d64bc-2wt95\" (UID: \"cb6adf46-208a-4945-97aa-2c457b9c2614\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.243844 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfg7c\" (UniqueName: \"kubernetes.io/projected/42b8c71f-abd8-49b1-b604-49b3292ba29a-kube-api-access-rfg7c\") pod \"placement-operator-controller-manager-78f8948974-9p4w9\" (UID: \"42b8c71f-abd8-49b1-b604-49b3292ba29a\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.250041 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.254675 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.257578 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-c7z5z" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.259232 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.264295 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.295862 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.297576 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.300857 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.301761 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.301793 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-r2gw4" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.305706 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.318311 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md52q\" (UniqueName: \"kubernetes.io/projected/f52c9389-ea61-4327-afd9-f4c92541a821-kube-api-access-md52q\") pod \"telemetry-operator-controller-manager-766b45bcdb-ksffb\" (UID: \"f52c9389-ea61-4327-afd9-f4c92541a821\") " pod="openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.318356 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c57n\" (UniqueName: \"kubernetes.io/projected/1a7e7363-7657-4eb2-a969-9f4c08a50984-kube-api-access-5c57n\") pod \"test-operator-controller-manager-5854674fcc-8d8zb\" (UID: \"1a7e7363-7657-4eb2-a969-9f4c08a50984\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.318433 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert\") pod \"infra-operator-controller-manager-78d48bff9d-8nj46\" (UID: \"e2834985-dbd0-4ad6-afd2-8238997ec8e5\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" Dec 11 13:22:52 crc kubenswrapper[4898]: E1211 13:22:52.318683 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 11 13:22:52 crc kubenswrapper[4898]: E1211 13:22:52.318739 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert podName:e2834985-dbd0-4ad6-afd2-8238997ec8e5 nodeName:}" failed. No retries permitted until 2025-12-11 13:22:53.318719758 +0000 UTC m=+1130.891046195 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert") pod "infra-operator-controller-manager-78d48bff9d-8nj46" (UID: "e2834985-dbd0-4ad6-afd2-8238997ec8e5") : secret "infra-operator-webhook-server-cert" not found Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.322532 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dvtzm" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.335830 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.337028 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.339514 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-tq2f6" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.348887 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.403002 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.405922 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nx4h5" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.421117 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.421232 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md52q\" (UniqueName: \"kubernetes.io/projected/f52c9389-ea61-4327-afd9-f4c92541a821-kube-api-access-md52q\") pod \"telemetry-operator-controller-manager-766b45bcdb-ksffb\" (UID: \"f52c9389-ea61-4327-afd9-f4c92541a821\") " pod="openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.421262 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5c57n\" (UniqueName: \"kubernetes.io/projected/1a7e7363-7657-4eb2-a969-9f4c08a50984-kube-api-access-5c57n\") pod \"test-operator-controller-manager-5854674fcc-8d8zb\" (UID: \"1a7e7363-7657-4eb2-a969-9f4c08a50984\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.421337 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkbvt\" (UniqueName: \"kubernetes.io/projected/7d6dbccc-94de-44f9-b7d2-5bbcfee1d119-kube-api-access-xkbvt\") pod \"watcher-operator-controller-manager-75944c9b7-9kw6z\" (UID: \"7d6dbccc-94de-44f9-b7d2-5bbcfee1d119\") " pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.421436 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt2hf\" (UniqueName: \"kubernetes.io/projected/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-kube-api-access-wt2hf\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.421455 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wnrp\" (UniqueName: \"kubernetes.io/projected/c7e9c45d-ed03-4e5f-a585-5d1af92727f9-kube-api-access-2wnrp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-djqgv\" (UID: \"c7e9c45d-ed03-4e5f-a585-5d1af92727f9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.421474 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.426699 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.445639 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.452103 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c57n\" (UniqueName: \"kubernetes.io/projected/1a7e7363-7657-4eb2-a969-9f4c08a50984-kube-api-access-5c57n\") pod \"test-operator-controller-manager-5854674fcc-8d8zb\" (UID: \"1a7e7363-7657-4eb2-a969-9f4c08a50984\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.453063 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md52q\" (UniqueName: \"kubernetes.io/projected/f52c9389-ea61-4327-afd9-f4c92541a821-kube-api-access-md52q\") pod \"telemetry-operator-controller-manager-766b45bcdb-ksffb\" (UID: \"f52c9389-ea61-4327-afd9-f4c92541a821\") " pod="openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.467851 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.527415 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.527866 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkbvt\" (UniqueName: \"kubernetes.io/projected/7d6dbccc-94de-44f9-b7d2-5bbcfee1d119-kube-api-access-xkbvt\") pod \"watcher-operator-controller-manager-75944c9b7-9kw6z\" (UID: \"7d6dbccc-94de-44f9-b7d2-5bbcfee1d119\") " pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z" Dec 11 13:22:52 crc kubenswrapper[4898]: E1211 13:22:52.527974 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.527996 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt2hf\" (UniqueName: \"kubernetes.io/projected/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-kube-api-access-wt2hf\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.528024 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wnrp\" (UniqueName: \"kubernetes.io/projected/c7e9c45d-ed03-4e5f-a585-5d1af92727f9-kube-api-access-2wnrp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-djqgv\" (UID: 
\"c7e9c45d-ed03-4e5f-a585-5d1af92727f9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.528046 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:22:52 crc kubenswrapper[4898]: E1211 13:22:52.528196 4898 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 11 13:22:52 crc kubenswrapper[4898]: E1211 13:22:52.528236 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs podName:c6d5540b-2eb6-411c-b1a9-b0db78e67ae7 nodeName:}" failed. No retries permitted until 2025-12-11 13:22:53.028219425 +0000 UTC m=+1130.600545862 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs") pod "openstack-operator-controller-manager-596947c645-4xjkq" (UID: "c6d5540b-2eb6-411c-b1a9-b0db78e67ae7") : secret "metrics-server-cert" not found Dec 11 13:22:52 crc kubenswrapper[4898]: E1211 13:22:52.528467 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs podName:c6d5540b-2eb6-411c-b1a9-b0db78e67ae7 nodeName:}" failed. No retries permitted until 2025-12-11 13:22:53.028423131 +0000 UTC m=+1130.600749568 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs") pod "openstack-operator-controller-manager-596947c645-4xjkq" (UID: "c6d5540b-2eb6-411c-b1a9-b0db78e67ae7") : secret "webhook-server-cert" not found Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.543679 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.544594 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.555771 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt2hf\" (UniqueName: \"kubernetes.io/projected/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-kube-api-access-wt2hf\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.558301 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wnrp\" (UniqueName: \"kubernetes.io/projected/c7e9c45d-ed03-4e5f-a585-5d1af92727f9-kube-api-access-2wnrp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-djqgv\" (UID: \"c7e9c45d-ed03-4e5f-a585-5d1af92727f9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.558655 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7"] Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.560752 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkbvt\" (UniqueName: 
\"kubernetes.io/projected/7d6dbccc-94de-44f9-b7d2-5bbcfee1d119-kube-api-access-xkbvt\") pod \"watcher-operator-controller-manager-75944c9b7-9kw6z\" (UID: \"7d6dbccc-94de-44f9-b7d2-5bbcfee1d119\") " pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z" Dec 11 13:22:52 crc kubenswrapper[4898]: W1211 13:22:52.575210 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7d8e047_7525_4d88_b802_550590e7f743.slice/crio-b75584360035b51a9cce170e82f93d9568a5368b7648c5f185490b30820dc622 WatchSource:0}: Error finding container b75584360035b51a9cce170e82f93d9568a5368b7648c5f185490b30820dc622: Status 404 returned error can't find the container with id b75584360035b51a9cce170e82f93d9568a5368b7648c5f185490b30820dc622 Dec 11 13:22:52 crc kubenswrapper[4898]: W1211 13:22:52.576803 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc154e39f_1760_4071_b688_f301c3a398e7.slice/crio-dc1e5a603d6f1673e8d69e0942fa7f644f71cf8aa072ffc8703f050fb83dc13b WatchSource:0}: Error finding container dc1e5a603d6f1673e8d69e0942fa7f644f71cf8aa072ffc8703f050fb83dc13b: Status 404 returned error can't find the container with id dc1e5a603d6f1673e8d69e0942fa7f644f71cf8aa072ffc8703f050fb83dc13b Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.588176 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.630620 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fv8ph7\" (UID: \"1fbc642b-9636-47c2-a3db-7913fa4a6b91\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 13:22:52 crc kubenswrapper[4898]: E1211 13:22:52.630910 4898 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 13:22:52 crc kubenswrapper[4898]: E1211 13:22:52.630977 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert podName:1fbc642b-9636-47c2-a3db-7913fa4a6b91 nodeName:}" failed. No retries permitted until 2025-12-11 13:22:53.630961935 +0000 UTC m=+1131.203288372 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fv8ph7" (UID: "1fbc642b-9636-47c2-a3db-7913fa4a6b91") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.658595 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv" Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.767352 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc" event={"ID":"d7d8e047-7525-4d88-b802-550590e7f743","Type":"ContainerStarted","Data":"b75584360035b51a9cce170e82f93d9568a5368b7648c5f185490b30820dc622"} Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.792842 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx" event={"ID":"c99ec3c0-d415-4322-95cd-d57411a1db7b","Type":"ContainerStarted","Data":"ada538905434c020c78056a80b4a1ad31a77c25441f623604463ea90148a58c9"} Dec 11 13:22:52 crc kubenswrapper[4898]: I1211 13:22:52.796102 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7" event={"ID":"c154e39f-1760-4071-b688-f301c3a398e7","Type":"ContainerStarted","Data":"dc1e5a603d6f1673e8d69e0942fa7f644f71cf8aa072ffc8703f050fb83dc13b"} Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.036674 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4lcmf"] Dec 11 13:22:53 crc kubenswrapper[4898]: W1211 13:22:53.037924 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c391a19_7c2d_4838_9269_2c5cd8eea1ad.slice/crio-abb4112b372b430c70301f409934c78ac13983e0848329e3f0826d83e2408eda WatchSource:0}: Error finding container abb4112b372b430c70301f409934c78ac13983e0848329e3f0826d83e2408eda: Status 404 returned error can't find the container with id abb4112b372b430c70301f409934c78ac13983e0848329e3f0826d83e2408eda Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.038938 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.039075 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:22:53 crc kubenswrapper[4898]: E1211 13:22:53.039092 4898 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 11 13:22:53 crc kubenswrapper[4898]: E1211 13:22:53.039141 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs podName:c6d5540b-2eb6-411c-b1a9-b0db78e67ae7 nodeName:}" failed. No retries permitted until 2025-12-11 13:22:54.039124797 +0000 UTC m=+1131.611451324 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs") pod "openstack-operator-controller-manager-596947c645-4xjkq" (UID: "c6d5540b-2eb6-411c-b1a9-b0db78e67ae7") : secret "metrics-server-cert" not found Dec 11 13:22:53 crc kubenswrapper[4898]: E1211 13:22:53.039196 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 11 13:22:53 crc kubenswrapper[4898]: E1211 13:22:53.039241 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs podName:c6d5540b-2eb6-411c-b1a9-b0db78e67ae7 nodeName:}" failed. No retries permitted until 2025-12-11 13:22:54.039227129 +0000 UTC m=+1131.611553566 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs") pod "openstack-operator-controller-manager-596947c645-4xjkq" (UID: "c6d5540b-2eb6-411c-b1a9-b0db78e67ae7") : secret "webhook-server-cert" not found Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.047079 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw"] Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.053602 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw"] Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.270041 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-qpznh"] Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.299840 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-qrfjf"] Dec 11 13:22:53 crc kubenswrapper[4898]: W1211 13:22:53.305209 
4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f2676f6_97b8_425e_9d05_9ec2c52055de.slice/crio-1b50dd94d10f68a702b85a7316273d8e5a5e3ae9382708b3099f73076e8fbf67 WatchSource:0}: Error finding container 1b50dd94d10f68a702b85a7316273d8e5a5e3ae9382708b3099f73076e8fbf67: Status 404 returned error can't find the container with id 1b50dd94d10f68a702b85a7316273d8e5a5e3ae9382708b3099f73076e8fbf67 Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.306388 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4"] Dec 11 13:22:53 crc kubenswrapper[4898]: W1211 13:22:53.317354 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode87a760e_bf60_4a98_bb37_1f44745e250f.slice/crio-347ba79a4d3fb74861187baf9560d67bd994233839fc74a4a75e65f05d87fb5b WatchSource:0}: Error finding container 347ba79a4d3fb74861187baf9560d67bd994233839fc74a4a75e65f05d87fb5b: Status 404 returned error can't find the container with id 347ba79a4d3fb74861187baf9560d67bd994233839fc74a4a75e65f05d87fb5b Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.346557 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert\") pod \"infra-operator-controller-manager-78d48bff9d-8nj46\" (UID: \"e2834985-dbd0-4ad6-afd2-8238997ec8e5\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" Dec 11 13:22:53 crc kubenswrapper[4898]: E1211 13:22:53.346769 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 11 13:22:53 crc kubenswrapper[4898]: E1211 13:22:53.346842 4898 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert podName:e2834985-dbd0-4ad6-afd2-8238997ec8e5 nodeName:}" failed. No retries permitted until 2025-12-11 13:22:55.34682403 +0000 UTC m=+1132.919150467 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert") pod "infra-operator-controller-manager-78d48bff9d-8nj46" (UID: "e2834985-dbd0-4ad6-afd2-8238997ec8e5") : secret "infra-operator-webhook-server-cert" not found Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.347610 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25"] Dec 11 13:22:53 crc kubenswrapper[4898]: W1211 13:22:53.352811 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda99a2194_b89b_4a6a_a086_acd20b489632.slice/crio-ee83c0e181d3feb63d16d1a53b6973228832717896d91c6334bc06ab262f8c63 WatchSource:0}: Error finding container ee83c0e181d3feb63d16d1a53b6973228832717896d91c6334bc06ab262f8c63: Status 404 returned error can't find the container with id ee83c0e181d3feb63d16d1a53b6973228832717896d91c6334bc06ab262f8c63 Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.654144 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fv8ph7\" (UID: \"1fbc642b-9636-47c2-a3db-7913fa4a6b91\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 13:22:53 crc kubenswrapper[4898]: E1211 13:22:53.654384 4898 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 13:22:53 crc 
kubenswrapper[4898]: E1211 13:22:53.654479 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert podName:1fbc642b-9636-47c2-a3db-7913fa4a6b91 nodeName:}" failed. No retries permitted until 2025-12-11 13:22:55.654441822 +0000 UTC m=+1133.226768259 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fv8ph7" (UID: "1fbc642b-9636-47c2-a3db-7913fa4a6b91") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.828644 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw" event={"ID":"0c6054e7-bb0a-4cbd-b459-d9d100182fa1","Type":"ContainerStarted","Data":"6a3f7f31cefc03af5109532116b30509ec03726aae7c9dc49769e07fdf01ade3"} Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.829273 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95"] Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.830725 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw" event={"ID":"5c391a19-7c2d-4838-9269-2c5cd8eea1ad","Type":"ContainerStarted","Data":"abb4112b372b430c70301f409934c78ac13983e0848329e3f0826d83e2408eda"} Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.836945 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-qpznh" event={"ID":"1e77353c-6728-4dfa-814c-1a92115c8bf2","Type":"ContainerStarted","Data":"3b735ea18c8b42bbdb6c2c6c879a627c6ff7a7a70a6e2d457d682e6f2a38adb4"} Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.845273 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4" event={"ID":"e87a760e-bf60-4a98-bb37-1f44745e250f","Type":"ContainerStarted","Data":"347ba79a4d3fb74861187baf9560d67bd994233839fc74a4a75e65f05d87fb5b"} Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.848388 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-dvtzm"] Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.861052 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4lcmf" event={"ID":"7a0813f7-7167-46ed-b9f8-e2157e92f620","Type":"ContainerStarted","Data":"3f93a83e2aa5330ff4a990e3b48397ec992432ad9724a128f5e0b09a1977f993"} Dec 11 13:22:53 crc kubenswrapper[4898]: W1211 13:22:53.862274 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac70ed50_7e53_4bb9_ac63_35e5c0651db5.slice/crio-282712a41b30d2d06b77d6c36a91d4fb3f42d8850cf8d9316d96d36494a9f338 WatchSource:0}: Error finding container 282712a41b30d2d06b77d6c36a91d4fb3f42d8850cf8d9316d96d36494a9f338: Status 404 returned error can't find the container with id 282712a41b30d2d06b77d6c36a91d4fb3f42d8850cf8d9316d96d36494a9f338 Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.862315 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-nx4h5"] Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.873898 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z"] Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.883718 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-qrfjf" 
event={"ID":"1f2676f6-97b8-425e-9d05-9ec2c52055de","Type":"ContainerStarted","Data":"1b50dd94d10f68a702b85a7316273d8e5a5e3ae9382708b3099f73076e8fbf67"} Dec 11 13:22:53 crc kubenswrapper[4898]: W1211 13:22:53.883805 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e97f63_c1cb_4ef1_9d95_0c11dc52c94c.slice/crio-56d54fb2afd6f1c552d60abb1f3a77b2bc5172722a6b3f3c1883768c599078e3 WatchSource:0}: Error finding container 56d54fb2afd6f1c552d60abb1f3a77b2bc5172722a6b3f3c1883768c599078e3: Status 404 returned error can't find the container with id 56d54fb2afd6f1c552d60abb1f3a77b2bc5172722a6b3f3c1883768c599078e3 Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.888078 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pgfnv"] Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.888818 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25" event={"ID":"a99a2194-b89b-4a6a-a086-acd20b489632","Type":"ContainerStarted","Data":"ee83c0e181d3feb63d16d1a53b6973228832717896d91c6334bc06ab262f8c63"} Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.908466 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb"] Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.917439 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9"] Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.937614 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv"] Dec 11 13:22:53 crc kubenswrapper[4898]: E1211 13:22:53.946524 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rfg7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-9p4w9_openstack-operators(42b8c71f-abd8-49b1-b604-49b3292ba29a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 11 13:22:53 crc kubenswrapper[4898]: E1211 13:22:53.949481 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rfg7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-9p4w9_openstack-operators(42b8c71f-abd8-49b1-b604-49b3292ba29a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 11 13:22:53 crc kubenswrapper[4898]: E1211 13:22:53.950562 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9" podUID="42b8c71f-abd8-49b1-b604-49b3292ba29a" Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.951124 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z"] Dec 11 13:22:53 crc kubenswrapper[4898]: E1211 13:22:53.954083 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5c57n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-8d8zb_openstack-operators(1a7e7363-7657-4eb2-a969-9f4c08a50984): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 11 13:22:53 crc kubenswrapper[4898]: E1211 13:22:53.954392 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 
500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2wnrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-djqgv_openstack-operators(c7e9c45d-ed03-4e5f-a585-5d1af92727f9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 11 13:22:53 crc kubenswrapper[4898]: E1211 13:22:53.955592 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv" podUID="c7e9c45d-ed03-4e5f-a585-5d1af92727f9" Dec 11 13:22:53 crc kubenswrapper[4898]: E1211 13:22:53.956626 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5c57n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-8d8zb_openstack-operators(1a7e7363-7657-4eb2-a969-9f4c08a50984): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 11 13:22:53 crc kubenswrapper[4898]: E1211 13:22:53.957853 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb" podUID="1a7e7363-7657-4eb2-a969-9f4c08a50984" Dec 11 13:22:53 crc kubenswrapper[4898]: I1211 13:22:53.977190 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb"] Dec 11 13:22:54 crc kubenswrapper[4898]: I1211 13:22:54.064797 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:22:54 crc kubenswrapper[4898]: I1211 13:22:54.064984 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:22:54 crc kubenswrapper[4898]: E1211 13:22:54.065002 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 11 13:22:54 crc kubenswrapper[4898]: E1211 13:22:54.065122 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs podName:c6d5540b-2eb6-411c-b1a9-b0db78e67ae7 nodeName:}" failed. No retries permitted until 2025-12-11 13:22:56.06504518 +0000 UTC m=+1133.637371617 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs") pod "openstack-operator-controller-manager-596947c645-4xjkq" (UID: "c6d5540b-2eb6-411c-b1a9-b0db78e67ae7") : secret "webhook-server-cert" not found Dec 11 13:22:54 crc kubenswrapper[4898]: E1211 13:22:54.065166 4898 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 11 13:22:54 crc kubenswrapper[4898]: E1211 13:22:54.065238 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs podName:c6d5540b-2eb6-411c-b1a9-b0db78e67ae7 nodeName:}" failed. No retries permitted until 2025-12-11 13:22:56.065220965 +0000 UTC m=+1133.637547402 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs") pod "openstack-operator-controller-manager-596947c645-4xjkq" (UID: "c6d5540b-2eb6-411c-b1a9-b0db78e67ae7") : secret "metrics-server-cert" not found Dec 11 13:22:55 crc kubenswrapper[4898]: I1211 13:22:55.567091 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert\") pod \"infra-operator-controller-manager-78d48bff9d-8nj46\" (UID: \"e2834985-dbd0-4ad6-afd2-8238997ec8e5\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" Dec 11 13:22:55 crc kubenswrapper[4898]: E1211 13:22:55.567309 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 11 13:22:55 crc kubenswrapper[4898]: E1211 13:22:55.567360 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert 
podName:e2834985-dbd0-4ad6-afd2-8238997ec8e5 nodeName:}" failed. No retries permitted until 2025-12-11 13:22:59.56734249 +0000 UTC m=+1137.139668927 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert") pod "infra-operator-controller-manager-78d48bff9d-8nj46" (UID: "e2834985-dbd0-4ad6-afd2-8238997ec8e5") : secret "infra-operator-webhook-server-cert" not found Dec 11 13:22:55 crc kubenswrapper[4898]: I1211 13:22:55.594211 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pgfnv" event={"ID":"09d9c781-008c-4486-807c-159f4fefe857","Type":"ContainerStarted","Data":"c0610a468ed3924a9f7fc6531465ee0aa94c530a03ef6fae750b983f39e6ec54"} Dec 11 13:22:55 crc kubenswrapper[4898]: I1211 13:22:55.604038 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nx4h5" event={"ID":"95c66498-ab0d-4618-b884-523e1183d758","Type":"ContainerStarted","Data":"0eef4108d03c52cf95f0901af13b3550217c46814de66175e140bf0d44691a35"} Dec 11 13:22:55 crc kubenswrapper[4898]: I1211 13:22:55.606769 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dvtzm" event={"ID":"88e97f63-c1cb-4ef1-9d95-0c11dc52c94c","Type":"ContainerStarted","Data":"56d54fb2afd6f1c552d60abb1f3a77b2bc5172722a6b3f3c1883768c599078e3"} Dec 11 13:22:55 crc kubenswrapper[4898]: I1211 13:22:55.608272 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95" event={"ID":"cb6adf46-208a-4945-97aa-2c457b9c2614","Type":"ContainerStarted","Data":"0c748c946feaf21e771def585d58171b398700d94f6fb201d507c947694782b2"} Dec 11 13:22:55 crc kubenswrapper[4898]: I1211 13:22:55.609490 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv" event={"ID":"c7e9c45d-ed03-4e5f-a585-5d1af92727f9","Type":"ContainerStarted","Data":"7eebac9d9f5903cb5e373fb1834a4d082ab5263dcf652f1ce57b1dc93a1d284e"} Dec 11 13:22:55 crc kubenswrapper[4898]: E1211 13:22:55.614047 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv" podUID="c7e9c45d-ed03-4e5f-a585-5d1af92727f9" Dec 11 13:22:55 crc kubenswrapper[4898]: I1211 13:22:55.614669 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb" event={"ID":"1a7e7363-7657-4eb2-a969-9f4c08a50984","Type":"ContainerStarted","Data":"31951da57e7b385b4c5fe560f131e0bdc2f92822681a566097bec5cd1b9981d6"} Dec 11 13:22:55 crc kubenswrapper[4898]: E1211 13:22:55.617399 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb" podUID="1a7e7363-7657-4eb2-a969-9f4c08a50984" Dec 11 13:22:55 crc kubenswrapper[4898]: I1211 13:22:55.617832 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9" 
event={"ID":"42b8c71f-abd8-49b1-b604-49b3292ba29a","Type":"ContainerStarted","Data":"d66e19c6552f11a55eb5921d57010492871478bd611720b368304a815a60cb3d"} Dec 11 13:22:55 crc kubenswrapper[4898]: E1211 13:22:55.624729 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9" podUID="42b8c71f-abd8-49b1-b604-49b3292ba29a" Dec 11 13:22:55 crc kubenswrapper[4898]: I1211 13:22:55.624954 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z" event={"ID":"ac70ed50-7e53-4bb9-ac63-35e5c0651db5","Type":"ContainerStarted","Data":"282712a41b30d2d06b77d6c36a91d4fb3f42d8850cf8d9316d96d36494a9f338"} Dec 11 13:22:55 crc kubenswrapper[4898]: I1211 13:22:55.627789 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb" event={"ID":"f52c9389-ea61-4327-afd9-f4c92541a821","Type":"ContainerStarted","Data":"3bec17f011c34ea3a388f8fc0ca6acdca66206a600fe6f77e6a82ca52ec2bfdb"} Dec 11 13:22:55 crc kubenswrapper[4898]: I1211 13:22:55.629490 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z" event={"ID":"7d6dbccc-94de-44f9-b7d2-5bbcfee1d119","Type":"ContainerStarted","Data":"91b76087010277b6e1f9d397136feda78421408361a2591a845adea1b664b9e7"} Dec 11 13:22:55 crc kubenswrapper[4898]: I1211 13:22:55.672483 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fv8ph7\" (UID: \"1fbc642b-9636-47c2-a3db-7913fa4a6b91\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 13:22:55 crc kubenswrapper[4898]: E1211 13:22:55.673356 4898 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 13:22:55 crc kubenswrapper[4898]: E1211 13:22:55.673701 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert podName:1fbc642b-9636-47c2-a3db-7913fa4a6b91 nodeName:}" failed. No retries permitted until 2025-12-11 13:22:59.673679137 +0000 UTC m=+1137.246005574 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fv8ph7" (UID: "1fbc642b-9636-47c2-a3db-7913fa4a6b91") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 13:22:56 crc kubenswrapper[4898]: I1211 13:22:56.084699 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:22:56 crc kubenswrapper[4898]: I1211 13:22:56.084807 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: 
\"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:22:56 crc kubenswrapper[4898]: E1211 13:22:56.084885 4898 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 11 13:22:56 crc kubenswrapper[4898]: E1211 13:22:56.084971 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 11 13:22:56 crc kubenswrapper[4898]: E1211 13:22:56.084983 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs podName:c6d5540b-2eb6-411c-b1a9-b0db78e67ae7 nodeName:}" failed. No retries permitted until 2025-12-11 13:23:00.084962632 +0000 UTC m=+1137.657289069 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs") pod "openstack-operator-controller-manager-596947c645-4xjkq" (UID: "c6d5540b-2eb6-411c-b1a9-b0db78e67ae7") : secret "metrics-server-cert" not found Dec 11 13:22:56 crc kubenswrapper[4898]: E1211 13:22:56.085003 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs podName:c6d5540b-2eb6-411c-b1a9-b0db78e67ae7 nodeName:}" failed. No retries permitted until 2025-12-11 13:23:00.084992503 +0000 UTC m=+1137.657318940 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs") pod "openstack-operator-controller-manager-596947c645-4xjkq" (UID: "c6d5540b-2eb6-411c-b1a9-b0db78e67ae7") : secret "webhook-server-cert" not found Dec 11 13:22:56 crc kubenswrapper[4898]: E1211 13:22:56.649700 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv" podUID="c7e9c45d-ed03-4e5f-a585-5d1af92727f9" Dec 11 13:22:56 crc kubenswrapper[4898]: E1211 13:22:56.652635 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9" podUID="42b8c71f-abd8-49b1-b604-49b3292ba29a" Dec 11 13:22:56 crc kubenswrapper[4898]: E1211 13:22:56.652665 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb" podUID="1a7e7363-7657-4eb2-a969-9f4c08a50984" Dec 11 13:22:59 crc kubenswrapper[4898]: I1211 13:22:59.649243 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert\") pod \"infra-operator-controller-manager-78d48bff9d-8nj46\" (UID: \"e2834985-dbd0-4ad6-afd2-8238997ec8e5\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" Dec 11 13:22:59 crc kubenswrapper[4898]: E1211 13:22:59.649426 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 11 13:22:59 crc kubenswrapper[4898]: E1211 13:22:59.649860 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert podName:e2834985-dbd0-4ad6-afd2-8238997ec8e5 nodeName:}" failed. No retries permitted until 2025-12-11 13:23:07.649840568 +0000 UTC m=+1145.222167005 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert") pod "infra-operator-controller-manager-78d48bff9d-8nj46" (UID: "e2834985-dbd0-4ad6-afd2-8238997ec8e5") : secret "infra-operator-webhook-server-cert" not found Dec 11 13:22:59 crc kubenswrapper[4898]: I1211 13:22:59.751816 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fv8ph7\" (UID: \"1fbc642b-9636-47c2-a3db-7913fa4a6b91\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 13:22:59 crc kubenswrapper[4898]: E1211 13:22:59.752010 4898 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 13:22:59 crc kubenswrapper[4898]: E1211 13:22:59.752085 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert podName:1fbc642b-9636-47c2-a3db-7913fa4a6b91 nodeName:}" failed. No retries permitted until 2025-12-11 13:23:07.752066263 +0000 UTC m=+1145.324392700 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fv8ph7" (UID: "1fbc642b-9636-47c2-a3db-7913fa4a6b91") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 13:23:00 crc kubenswrapper[4898]: I1211 13:23:00.160123 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:23:00 crc kubenswrapper[4898]: I1211 13:23:00.160289 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:23:00 crc kubenswrapper[4898]: E1211 13:23:00.160527 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 11 13:23:00 crc kubenswrapper[4898]: E1211 13:23:00.160639 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs podName:c6d5540b-2eb6-411c-b1a9-b0db78e67ae7 nodeName:}" failed. No retries permitted until 2025-12-11 13:23:08.160609145 +0000 UTC m=+1145.732935622 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs") pod "openstack-operator-controller-manager-596947c645-4xjkq" (UID: "c6d5540b-2eb6-411c-b1a9-b0db78e67ae7") : secret "webhook-server-cert" not found Dec 11 13:23:00 crc kubenswrapper[4898]: E1211 13:23:00.160721 4898 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 11 13:23:00 crc kubenswrapper[4898]: E1211 13:23:00.160823 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs podName:c6d5540b-2eb6-411c-b1a9-b0db78e67ae7 nodeName:}" failed. No retries permitted until 2025-12-11 13:23:08.160800481 +0000 UTC m=+1145.733126958 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs") pod "openstack-operator-controller-manager-596947c645-4xjkq" (UID: "c6d5540b-2eb6-411c-b1a9-b0db78e67ae7") : secret "metrics-server-cert" not found Dec 11 13:23:07 crc kubenswrapper[4898]: I1211 13:23:07.696620 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert\") pod \"infra-operator-controller-manager-78d48bff9d-8nj46\" (UID: \"e2834985-dbd0-4ad6-afd2-8238997ec8e5\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" Dec 11 13:23:07 crc kubenswrapper[4898]: I1211 13:23:07.702868 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2834985-dbd0-4ad6-afd2-8238997ec8e5-cert\") pod \"infra-operator-controller-manager-78d48bff9d-8nj46\" (UID: \"e2834985-dbd0-4ad6-afd2-8238997ec8e5\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" Dec 11 13:23:07 crc 
kubenswrapper[4898]: I1211 13:23:07.798813 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fv8ph7\" (UID: \"1fbc642b-9636-47c2-a3db-7913fa4a6b91\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 13:23:07 crc kubenswrapper[4898]: I1211 13:23:07.812848 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1fbc642b-9636-47c2-a3db-7913fa4a6b91-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fv8ph7\" (UID: \"1fbc642b-9636-47c2-a3db-7913fa4a6b91\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 13:23:07 crc kubenswrapper[4898]: I1211 13:23:07.948762 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9h4wz" Dec 11 13:23:07 crc kubenswrapper[4898]: I1211 13:23:07.956856 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" Dec 11 13:23:07 crc kubenswrapper[4898]: I1211 13:23:07.964603 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-4dw8m" Dec 11 13:23:07 crc kubenswrapper[4898]: I1211 13:23:07.973110 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 13:23:08 crc kubenswrapper[4898]: I1211 13:23:08.204370 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:23:08 crc kubenswrapper[4898]: I1211 13:23:08.204722 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:23:08 crc kubenswrapper[4898]: E1211 13:23:08.204869 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 11 13:23:08 crc kubenswrapper[4898]: E1211 13:23:08.204944 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs podName:c6d5540b-2eb6-411c-b1a9-b0db78e67ae7 nodeName:}" failed. No retries permitted until 2025-12-11 13:23:24.204924769 +0000 UTC m=+1161.777251206 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs") pod "openstack-operator-controller-manager-596947c645-4xjkq" (UID: "c6d5540b-2eb6-411c-b1a9-b0db78e67ae7") : secret "webhook-server-cert" not found Dec 11 13:23:08 crc kubenswrapper[4898]: I1211 13:23:08.207967 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-metrics-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:23:09 crc kubenswrapper[4898]: E1211 13:23:09.546204 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 11 13:23:09 crc kubenswrapper[4898]: E1211 13:23:09.546400 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mphhz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-tq8mw_openstack-operators(0c6054e7-bb0a-4cbd-b459-d9d100182fa1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:23:10 crc kubenswrapper[4898]: E1211 13:23:10.746148 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a" Dec 11 13:23:10 crc kubenswrapper[4898]: E1211 13:23:10.746986 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jdd8k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5b5fd79c9c-wxz25_openstack-operators(a99a2194-b89b-4a6a-a086-acd20b489632): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:23:11 crc kubenswrapper[4898]: E1211 13:23:11.419828 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027" Dec 11 13:23:11 crc kubenswrapper[4898]: E1211 13:23:11.419984 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bflkc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5697bb5779-7kffw_openstack-operators(5c391a19-7c2d-4838-9269-2c5cd8eea1ad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:23:15 crc kubenswrapper[4898]: E1211 13:23:15.241150 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a" Dec 11 13:23:15 crc kubenswrapper[4898]: E1211 13:23:15.241711 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-htdgw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-697fb699cf-pdvzc_openstack-operators(d7d8e047-7525-4d88-b802-550590e7f743): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:23:17 crc kubenswrapper[4898]: E1211 13:23:17.615614 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 11 13:23:17 crc kubenswrapper[4898]: E1211 13:23:17.616042 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-772lz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-dvtzm_openstack-operators(88e97f63-c1cb-4ef1-9d95-0c11dc52c94c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:23:18 crc kubenswrapper[4898]: E1211 13:23:18.195039 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 11 13:23:18 crc kubenswrapper[4898]: E1211 13:23:18.195542 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t5k4r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-pgfnv_openstack-operators(09d9c781-008c-4486-807c-159f4fefe857): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:23:18 crc kubenswrapper[4898]: E1211 13:23:18.753682 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991" Dec 11 13:23:18 crc kubenswrapper[4898]: E1211 13:23:18.753880 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lkpbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-2wt95_openstack-operators(cb6adf46-208a-4945-97aa-2c457b9c2614): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:23:19 crc kubenswrapper[4898]: E1211 13:23:19.336405 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a" Dec 11 13:23:19 crc kubenswrapper[4898]: E1211 13:23:19.336623 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xkbvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-75944c9b7-9kw6z_openstack-operators(7d6dbccc-94de-44f9-b7d2-5bbcfee1d119): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:23:19 crc kubenswrapper[4898]: E1211 13:23:19.401429 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.98:5001/openstack-k8s-operators/telemetry-operator:5058fb97d80b3d5b49d16e3a70bedcd92b379609" Dec 11 13:23:19 crc kubenswrapper[4898]: E1211 13:23:19.401501 4898 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.98:5001/openstack-k8s-operators/telemetry-operator:5058fb97d80b3d5b49d16e3a70bedcd92b379609" Dec 11 13:23:19 crc kubenswrapper[4898]: E1211 13:23:19.401643 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.98:5001/openstack-k8s-operators/telemetry-operator:5058fb97d80b3d5b49d16e3a70bedcd92b379609,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-md52q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-766b45bcdb-ksffb_openstack-operators(f52c9389-ea61-4327-afd9-f4c92541a821): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:23:20 crc kubenswrapper[4898]: E1211 13:23:20.129919 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 11 13:23:20 crc kubenswrapper[4898]: E1211 13:23:20.130286 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d5f2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-wlqm4_openstack-operators(e87a760e-bf60-4a98-bb37-1f44745e250f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:23:24 crc kubenswrapper[4898]: I1211 13:23:24.225478 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:23:24 crc kubenswrapper[4898]: I1211 13:23:24.232639 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c6d5540b-2eb6-411c-b1a9-b0db78e67ae7-webhook-certs\") pod \"openstack-operator-controller-manager-596947c645-4xjkq\" (UID: \"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7\") " pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:23:24 crc kubenswrapper[4898]: I1211 13:23:24.431012 4898 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-r2gw4" Dec 11 13:23:24 crc kubenswrapper[4898]: I1211 13:23:24.439410 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:23:27 crc kubenswrapper[4898]: E1211 13:23:27.906568 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 11 13:23:27 crc kubenswrapper[4898]: E1211 13:23:27.907226 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pglc4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-pqc9z_openstack-operators(ac70ed50-7e53-4bb9-ac63-35e5c0651db5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:23:28 crc kubenswrapper[4898]: E1211 13:23:28.464952 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 11 13:23:28 crc kubenswrapper[4898]: E1211 13:23:28.465623 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2wnrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-djqgv_openstack-operators(c7e9c45d-ed03-4e5f-a585-5d1af92727f9): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:23:28 crc kubenswrapper[4898]: E1211 13:23:28.467083 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv" podUID="c7e9c45d-ed03-4e5f-a585-5d1af92727f9" Dec 11 13:23:29 crc kubenswrapper[4898]: I1211 13:23:29.016161 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46"] Dec 11 13:23:29 crc kubenswrapper[4898]: I1211 13:23:29.300591 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7"] Dec 11 13:23:29 crc kubenswrapper[4898]: I1211 13:23:29.780203 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq"] Dec 11 13:23:29 crc kubenswrapper[4898]: I1211 13:23:29.946618 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx" event={"ID":"c99ec3c0-d415-4322-95cd-d57411a1db7b","Type":"ContainerStarted","Data":"e8f56a6306d45bf6f88c5f508f064eb0a1d59d0f3b92992998886d0216cc52b6"} Dec 11 13:23:29 crc kubenswrapper[4898]: I1211 13:23:29.948583 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-qrfjf" event={"ID":"1f2676f6-97b8-425e-9d05-9ec2c52055de","Type":"ContainerStarted","Data":"3f2031760970a5c30b84980efde9543c9adfff6cdf18d39bf03eaa3e4478300d"} Dec 11 13:23:29 crc kubenswrapper[4898]: I1211 13:23:29.949870 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-qpznh" 
event={"ID":"1e77353c-6728-4dfa-814c-1a92115c8bf2","Type":"ContainerStarted","Data":"938bace849a3bab0f39e755bba5a323ef946f21e2a2c96fbbebb59f03094283e"} Dec 11 13:23:29 crc kubenswrapper[4898]: I1211 13:23:29.950927 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" event={"ID":"1fbc642b-9636-47c2-a3db-7913fa4a6b91","Type":"ContainerStarted","Data":"d1331cb6891b94a7c0f7bf4190c171b41a09d7e921d033d65218a20154ad89a8"} Dec 11 13:23:29 crc kubenswrapper[4898]: I1211 13:23:29.951911 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" event={"ID":"e2834985-dbd0-4ad6-afd2-8238997ec8e5","Type":"ContainerStarted","Data":"b86bec6c4e75229a57548e1dfcdf4ba79c5d5d884c049b147db4ca15ad0084e3"} Dec 11 13:23:30 crc kubenswrapper[4898]: I1211 13:23:30.962778 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4lcmf" event={"ID":"7a0813f7-7167-46ed-b9f8-e2157e92f620","Type":"ContainerStarted","Data":"3bafd86c16954503806830c437f3a807c905999944f335645dc61292479d4c93"} Dec 11 13:23:30 crc kubenswrapper[4898]: I1211 13:23:30.964838 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" event={"ID":"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7","Type":"ContainerStarted","Data":"466b7031961295744b99eaff660d3dfd9b41480ca9a43c42d40066b0ce1b5e75"} Dec 11 13:23:30 crc kubenswrapper[4898]: I1211 13:23:30.965262 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7" event={"ID":"c154e39f-1760-4071-b688-f301c3a398e7","Type":"ContainerStarted","Data":"80c7122c33e1badefbf5002d0a587f09e1a32a9198e803f0d06c129e9ac6bb24"} Dec 11 13:23:30 crc kubenswrapper[4898]: I1211 13:23:30.967132 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9" event={"ID":"42b8c71f-abd8-49b1-b604-49b3292ba29a","Type":"ContainerStarted","Data":"766e418b378d0581338cf6dee9df32f76a0d681768b4e38db1c852439dc98e84"} Dec 11 13:23:30 crc kubenswrapper[4898]: I1211 13:23:30.968904 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nx4h5" event={"ID":"95c66498-ab0d-4618-b884-523e1183d758","Type":"ContainerStarted","Data":"2ac6d51788e4e524499fff3d860a208ce80fba084b64b05ef1f2c9692d1bf0f4"} Dec 11 13:23:32 crc kubenswrapper[4898]: I1211 13:23:32.004472 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb" event={"ID":"1a7e7363-7657-4eb2-a969-9f4c08a50984","Type":"ContainerStarted","Data":"91083dfa6ee5be08a5717d0c2b9a537caf179f57cc581c6054f7032fa46ae623"} Dec 11 13:23:33 crc kubenswrapper[4898]: E1211 13:23:33.212728 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc" podUID="d7d8e047-7525-4d88-b802-550590e7f743" Dec 11 13:23:34 crc kubenswrapper[4898]: I1211 13:23:34.028929 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" event={"ID":"c6d5540b-2eb6-411c-b1a9-b0db78e67ae7","Type":"ContainerStarted","Data":"c30852cce6e1d4e75cd4429ca40a5be48e611929a54b6546057feee943223457"} Dec 11 13:23:34 crc kubenswrapper[4898]: I1211 13:23:34.029096 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:23:34 crc kubenswrapper[4898]: I1211 13:23:34.031214 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc" event={"ID":"d7d8e047-7525-4d88-b802-550590e7f743","Type":"ContainerStarted","Data":"0b1deb913c038e9e48469f7a2c8fcb70bf5d33095afe5ee64a5c0d0ca5379292"} Dec 11 13:23:34 crc kubenswrapper[4898]: I1211 13:23:34.056852 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" podStartSLOduration=43.056835833 podStartE2EDuration="43.056835833s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:23:34.054933072 +0000 UTC m=+1171.627259529" watchObservedRunningTime="2025-12-11 13:23:34.056835833 +0000 UTC m=+1171.629162270" Dec 11 13:23:34 crc kubenswrapper[4898]: E1211 13:23:34.946335 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw" podUID="0c6054e7-bb0a-4cbd-b459-d9d100182fa1" Dec 11 13:23:35 crc kubenswrapper[4898]: I1211 13:23:35.047622 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx" event={"ID":"c99ec3c0-d415-4322-95cd-d57411a1db7b","Type":"ContainerStarted","Data":"b01e34bb7e074357df5fc34395e3eb79a31ff5e7abeae5370b93faa8769a021c"} Dec 11 13:23:35 crc kubenswrapper[4898]: I1211 13:23:35.047806 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx" Dec 11 13:23:35 crc kubenswrapper[4898]: I1211 13:23:35.049337 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx" Dec 11 13:23:35 crc kubenswrapper[4898]: I1211 13:23:35.050232 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw" event={"ID":"0c6054e7-bb0a-4cbd-b459-d9d100182fa1","Type":"ContainerStarted","Data":"4f371b7cdf36660fab1b0b7f233a3fe000749a7ee3d97adb8aba6feeb733995f"} Dec 11 13:23:35 crc kubenswrapper[4898]: I1211 13:23:35.067537 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx" podStartSLOduration=3.509148854 podStartE2EDuration="44.067520305s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:52.459157507 +0000 UTC m=+1130.031483944" lastFinishedPulling="2025-12-11 13:23:33.017528958 +0000 UTC m=+1170.589855395" observedRunningTime="2025-12-11 13:23:35.062506609 +0000 UTC m=+1172.634833046" watchObservedRunningTime="2025-12-11 13:23:35.067520305 +0000 UTC m=+1172.639846742" Dec 11 13:23:36 crc kubenswrapper[4898]: I1211 13:23:36.069370 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-qrfjf" event={"ID":"1f2676f6-97b8-425e-9d05-9ec2c52055de","Type":"ContainerStarted","Data":"9761ec4c19ceaeaee14bfe28a267b19d64d1dee47239d1d1a98d6880005d2b51"} Dec 11 13:23:36 crc kubenswrapper[4898]: I1211 13:23:36.069845 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-967d97867-qrfjf" Dec 11 13:23:36 crc kubenswrapper[4898]: I1211 13:23:36.074923 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-967d97867-qrfjf" Dec 11 13:23:36 crc kubenswrapper[4898]: I1211 13:23:36.095526 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-967d97867-qrfjf" podStartSLOduration=5.386752756 podStartE2EDuration="45.095503943s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.307739263 +0000 UTC m=+1130.880065700" lastFinishedPulling="2025-12-11 13:23:33.01649045 +0000 UTC m=+1170.588816887" observedRunningTime="2025-12-11 13:23:36.084163626 +0000 UTC m=+1173.656490063" watchObservedRunningTime="2025-12-11 13:23:36.095503943 +0000 UTC m=+1173.667830380" Dec 11 13:23:36 crc kubenswrapper[4898]: E1211 13:23:36.193810 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95" podUID="cb6adf46-208a-4945-97aa-2c457b9c2614" Dec 11 13:23:36 crc kubenswrapper[4898]: E1211 13:23:36.224180 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z" podUID="ac70ed50-7e53-4bb9-ac63-35e5c0651db5" Dec 11 13:23:36 crc kubenswrapper[4898]: E1211 13:23:36.255028 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb" podUID="f52c9389-ea61-4327-afd9-f4c92541a821" Dec 11 13:23:36 crc kubenswrapper[4898]: E1211 13:23:36.554029 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-dvtzm" podUID="88e97f63-c1cb-4ef1-9d95-0c11dc52c94c" Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.077741 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw" event={"ID":"5c391a19-7c2d-4838-9269-2c5cd8eea1ad","Type":"ContainerStarted","Data":"f0d6819ce57509abc7f25f06b2ec394db085fcec95f12e4eeb56fd1a8cfacf7d"} Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.079407 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc" event={"ID":"d7d8e047-7525-4d88-b802-550590e7f743","Type":"ContainerStarted","Data":"e92c6a9ab8597e23589c7a7d9e9b700f7ef6d773c5888f1292e299d01fa447d2"} Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.080726 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc" Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.086977 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" event={"ID":"e2834985-dbd0-4ad6-afd2-8238997ec8e5","Type":"ContainerStarted","Data":"ed10b5f31179993b87032f4d900f74bfd15542901fc24c923c2497f3c66db8b8"} Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.089991 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z" event={"ID":"ac70ed50-7e53-4bb9-ac63-35e5c0651db5","Type":"ContainerStarted","Data":"f9d0ff30a914d693fc0e2a6b7564b0258c84e12896ecac64831d535ad4f4dcfb"} Dec 11 13:23:37 crc kubenswrapper[4898]: E1211 13:23:37.091556 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z" podUID="ac70ed50-7e53-4bb9-ac63-35e5c0651db5" Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.092891 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95" event={"ID":"cb6adf46-208a-4945-97aa-2c457b9c2614","Type":"ContainerStarted","Data":"3794f508b786a6b5668be59202164f3896848a1c69b8753f5535c31e9302c8e1"} Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.095310 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4lcmf" event={"ID":"7a0813f7-7167-46ed-b9f8-e2157e92f620","Type":"ContainerStarted","Data":"711cb8eb9f108faca61e5b90fe4e2e72093c57f10edf22899ee94e9563e69d8f"} Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.095730 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4lcmf" Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.098809 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb" event={"ID":"f52c9389-ea61-4327-afd9-f4c92541a821","Type":"ContainerStarted","Data":"bd2cfa317e773d65b4cd50c5d00e0e9187545703c1787a2c2c6365b0267f4af0"} Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.099308 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4lcmf" Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.106200 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw" 
event={"ID":"0c6054e7-bb0a-4cbd-b459-d9d100182fa1","Type":"ContainerStarted","Data":"108c851f84f199fdd82d1a2377567c4f310c5e86c6e90de2b2c289cd2784fe62"} Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.107048 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw" Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.111907 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" event={"ID":"1fbc642b-9636-47c2-a3db-7913fa4a6b91","Type":"ContainerStarted","Data":"4349b750ee8f0ff9ebcade040362a3fdaaed85dd862c4059adc6e3c26f1d1eef"} Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.115331 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc" podStartSLOduration=3.059191441 podStartE2EDuration="46.115316281s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:52.597863849 +0000 UTC m=+1130.170190286" lastFinishedPulling="2025-12-11 13:23:35.653988689 +0000 UTC m=+1173.226315126" observedRunningTime="2025-12-11 13:23:37.111711473 +0000 UTC m=+1174.684037910" watchObservedRunningTime="2025-12-11 13:23:37.115316281 +0000 UTC m=+1174.687642718" Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.119649 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nx4h5" event={"ID":"95c66498-ab0d-4618-b884-523e1183d758","Type":"ContainerStarted","Data":"a3301a64e506c7763ab32594dce4270f65499d2b244bfc5c48f881366e06560c"} Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.120185 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nx4h5" Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.125875 
4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dvtzm" event={"ID":"88e97f63-c1cb-4ef1-9d95-0c11dc52c94c","Type":"ContainerStarted","Data":"76988b803d9ad1ab9d04f3a0f68192b6f761ec99384aabab4783d3839be17436"} Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.125955 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nx4h5" Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.196372 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw" podStartSLOduration=3.577140623 podStartE2EDuration="46.196354003s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.036545347 +0000 UTC m=+1130.608871784" lastFinishedPulling="2025-12-11 13:23:35.655758727 +0000 UTC m=+1173.228085164" observedRunningTime="2025-12-11 13:23:37.190825163 +0000 UTC m=+1174.763151600" watchObservedRunningTime="2025-12-11 13:23:37.196354003 +0000 UTC m=+1174.768680430" Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.214520 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4lcmf" podStartSLOduration=3.632041078 podStartE2EDuration="46.214505094s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.03629741 +0000 UTC m=+1130.608623847" lastFinishedPulling="2025-12-11 13:23:35.618761426 +0000 UTC m=+1173.191087863" observedRunningTime="2025-12-11 13:23:37.210726062 +0000 UTC m=+1174.783052489" watchObservedRunningTime="2025-12-11 13:23:37.214505094 +0000 UTC m=+1174.786831531" Dec 11 13:23:37 crc kubenswrapper[4898]: I1211 13:23:37.232056 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nx4h5" podStartSLOduration=5.2904888119999995 podStartE2EDuration="46.232036438s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.908971998 +0000 UTC m=+1131.481298435" lastFinishedPulling="2025-12-11 13:23:34.850519624 +0000 UTC m=+1172.422846061" observedRunningTime="2025-12-11 13:23:37.226058906 +0000 UTC m=+1174.798385343" watchObservedRunningTime="2025-12-11 13:23:37.232036438 +0000 UTC m=+1174.804362875" Dec 11 13:23:37 crc kubenswrapper[4898]: E1211 13:23:37.806965 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw" podUID="5c391a19-7c2d-4838-9269-2c5cd8eea1ad" Dec 11 13:23:37 crc kubenswrapper[4898]: E1211 13:23:37.807158 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4" podUID="e87a760e-bf60-4a98-bb37-1f44745e250f" Dec 11 13:23:37 crc kubenswrapper[4898]: E1211 13:23:37.807299 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25" podUID="a99a2194-b89b-4a6a-a086-acd20b489632" Dec 11 13:23:37 crc kubenswrapper[4898]: E1211 13:23:37.807440 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pgfnv" podUID="09d9c781-008c-4486-807c-159f4fefe857" Dec 11 13:23:37 crc kubenswrapper[4898]: E1211 13:23:37.890011 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z" podUID="7d6dbccc-94de-44f9-b7d2-5bbcfee1d119" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.143190 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4" event={"ID":"e87a760e-bf60-4a98-bb37-1f44745e250f","Type":"ContainerStarted","Data":"366321fd210fb526542d927f27a2806079ce2ce66ecced31caa510ade563b1aa"} Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.153959 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25" event={"ID":"a99a2194-b89b-4a6a-a086-acd20b489632","Type":"ContainerStarted","Data":"7c2b1bf66d9c4945e158f724146f4c2aeb987b6af93cff9e0657831bec0c49ca"} Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.166105 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pgfnv" event={"ID":"09d9c781-008c-4486-807c-159f4fefe857","Type":"ContainerStarted","Data":"d0aa1fe66ed16d7fcd28ed570c23a9e07a0ee951d8c9bf2143c30d8848e27f12"} Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.178908 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-qpznh" event={"ID":"1e77353c-6728-4dfa-814c-1a92115c8bf2","Type":"ContainerStarted","Data":"1f36d528b372fd94668095bd9f0ac839b0d574cb9ba8407daaa3db1b885537b0"} Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.179663 4898 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-qpznh" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.198905 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-qpznh" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.214797 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" event={"ID":"e2834985-dbd0-4ad6-afd2-8238997ec8e5","Type":"ContainerStarted","Data":"d49f9bdfbfdd087da2f67c47e71d078feed8896bd12b49ee629b50a6782e8ba3"} Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.215731 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.246217 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z" event={"ID":"7d6dbccc-94de-44f9-b7d2-5bbcfee1d119","Type":"ContainerStarted","Data":"10b1c8f067438d5bdfbd83a160efe273f0e9250b2eff078785456fce4bd015d8"} Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.256395 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-qpznh" podStartSLOduration=4.9343975 podStartE2EDuration="47.256379049s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.275561893 +0000 UTC m=+1130.847888330" lastFinishedPulling="2025-12-11 13:23:35.597543442 +0000 UTC m=+1173.169869879" observedRunningTime="2025-12-11 13:23:38.254732964 +0000 UTC m=+1175.827059401" watchObservedRunningTime="2025-12-11 13:23:38.256379049 +0000 UTC m=+1175.828705486" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.261015 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb" event={"ID":"1a7e7363-7657-4eb2-a969-9f4c08a50984","Type":"ContainerStarted","Data":"dc627c5ea4d0791c7582f55077893c1cf0ab99938e3edc8e550abd23c3a6997a"} Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.262309 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.270926 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.271344 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7" event={"ID":"c154e39f-1760-4071-b688-f301c3a398e7","Type":"ContainerStarted","Data":"b06dd3bb09b0ba1c93e1de202fee6f43748758ff76470305a547aea78a57a5ff"} Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.272184 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.273365 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9" event={"ID":"42b8c71f-abd8-49b1-b604-49b3292ba29a","Type":"ContainerStarted","Data":"5661bd3db5601fdd4110673d58767c6d08126fd655010b0699d3e8403bc4e6ba"} Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.274949 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.281842 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7" 
Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.285932 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" podStartSLOduration=41.147379799 podStartE2EDuration="47.285921328s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:23:29.381873916 +0000 UTC m=+1166.954200353" lastFinishedPulling="2025-12-11 13:23:35.520415445 +0000 UTC m=+1173.092741882" observedRunningTime="2025-12-11 13:23:38.275688111 +0000 UTC m=+1175.848014548" watchObservedRunningTime="2025-12-11 13:23:38.285921328 +0000 UTC m=+1175.858247765" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.293881 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.296709 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" event={"ID":"1fbc642b-9636-47c2-a3db-7913fa4a6b91","Type":"ContainerStarted","Data":"73fa9c7878716a5885d4967444797552e6144b4efd6e1bc0b897f24321a51bda"} Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.297489 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.317660 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb" podStartSLOduration=5.663042461 podStartE2EDuration="47.317648036s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.953933574 +0000 UTC m=+1131.526260011" lastFinishedPulling="2025-12-11 13:23:35.608539149 +0000 UTC m=+1173.180865586" observedRunningTime="2025-12-11 
13:23:38.315812997 +0000 UTC m=+1175.888139434" watchObservedRunningTime="2025-12-11 13:23:38.317648036 +0000 UTC m=+1175.889974473" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.329210 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb" event={"ID":"f52c9389-ea61-4327-afd9-f4c92541a821","Type":"ContainerStarted","Data":"70eba36488af7e69ce374bfd6e879ef6839a763f1805cb09664963e19b93be95"} Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.396838 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7" podStartSLOduration=4.375005387 podStartE2EDuration="47.396818758s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:52.597807438 +0000 UTC m=+1130.170133875" lastFinishedPulling="2025-12-11 13:23:35.619620809 +0000 UTC m=+1173.191947246" observedRunningTime="2025-12-11 13:23:38.38802381 +0000 UTC m=+1175.960350247" watchObservedRunningTime="2025-12-11 13:23:38.396818758 +0000 UTC m=+1175.969145195" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.421674 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9" podStartSLOduration=5.749772998 podStartE2EDuration="47.42165682s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.946299328 +0000 UTC m=+1131.518625765" lastFinishedPulling="2025-12-11 13:23:35.61818315 +0000 UTC m=+1173.190509587" observedRunningTime="2025-12-11 13:23:38.412858282 +0000 UTC m=+1175.985184719" watchObservedRunningTime="2025-12-11 13:23:38.42165682 +0000 UTC m=+1175.993983257" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.457255 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" podStartSLOduration=41.193606809 podStartE2EDuration="47.457239702s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:23:29.389139623 +0000 UTC m=+1166.961466060" lastFinishedPulling="2025-12-11 13:23:35.652772516 +0000 UTC m=+1173.225098953" observedRunningTime="2025-12-11 13:23:38.453133871 +0000 UTC m=+1176.025460308" watchObservedRunningTime="2025-12-11 13:23:38.457239702 +0000 UTC m=+1176.029566139" Dec 11 13:23:38 crc kubenswrapper[4898]: I1211 13:23:38.478795 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb" podStartSLOduration=3.461968087 podStartE2EDuration="47.478775285s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.886440898 +0000 UTC m=+1131.458767325" lastFinishedPulling="2025-12-11 13:23:37.903248086 +0000 UTC m=+1175.475574523" observedRunningTime="2025-12-11 13:23:38.4782244 +0000 UTC m=+1176.050550837" watchObservedRunningTime="2025-12-11 13:23:38.478775285 +0000 UTC m=+1176.051101722" Dec 11 13:23:39 crc kubenswrapper[4898]: I1211 13:23:39.337719 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95" event={"ID":"cb6adf46-208a-4945-97aa-2c457b9c2614","Type":"ContainerStarted","Data":"dd7be9be2fb7948220c3818d44ac0574a34115a3c0fa7858bcf7c5fd88f118c0"} Dec 11 13:23:39 crc kubenswrapper[4898]: I1211 13:23:39.338229 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95" Dec 11 13:23:39 crc kubenswrapper[4898]: I1211 13:23:39.340113 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z" 
event={"ID":"7d6dbccc-94de-44f9-b7d2-5bbcfee1d119","Type":"ContainerStarted","Data":"cd745a005e1b7c9b08d3b48edcf9e8f7e3b19d5dbdb6d44c0fa8531f1aace080"} Dec 11 13:23:39 crc kubenswrapper[4898]: I1211 13:23:39.340185 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z" Dec 11 13:23:39 crc kubenswrapper[4898]: I1211 13:23:39.342055 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25" event={"ID":"a99a2194-b89b-4a6a-a086-acd20b489632","Type":"ContainerStarted","Data":"3cd427a70095e4f44748ebe5479f2dcf14b6ff93510f7eee44b5666d03143faf"} Dec 11 13:23:39 crc kubenswrapper[4898]: I1211 13:23:39.342174 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25" Dec 11 13:23:39 crc kubenswrapper[4898]: I1211 13:23:39.345027 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pgfnv" event={"ID":"09d9c781-008c-4486-807c-159f4fefe857","Type":"ContainerStarted","Data":"91a4f2df6cd794c3835090b18f071106f4437cc48ea186641d484562a1187e91"} Dec 11 13:23:39 crc kubenswrapper[4898]: I1211 13:23:39.345154 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pgfnv" Dec 11 13:23:39 crc kubenswrapper[4898]: I1211 13:23:39.346899 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dvtzm" event={"ID":"88e97f63-c1cb-4ef1-9d95-0c11dc52c94c","Type":"ContainerStarted","Data":"4dc95f2f6b5fcd6dea0d0d18c4e09fc206b8e7899b74fffd7ad63ee12678f681"} Dec 11 13:23:39 crc kubenswrapper[4898]: I1211 13:23:39.349082 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb" Dec 11 13:23:39 crc kubenswrapper[4898]: I1211 13:23:39.365442 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95" podStartSLOduration=4.254517177 podStartE2EDuration="48.36542978s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.886016677 +0000 UTC m=+1131.458343114" lastFinishedPulling="2025-12-11 13:23:37.99692928 +0000 UTC m=+1175.569255717" observedRunningTime="2025-12-11 13:23:39.360562018 +0000 UTC m=+1176.932888455" watchObservedRunningTime="2025-12-11 13:23:39.36542978 +0000 UTC m=+1176.937756217" Dec 11 13:23:39 crc kubenswrapper[4898]: I1211 13:23:39.378228 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dvtzm" podStartSLOduration=4.266602304 podStartE2EDuration="48.378210956s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.93307093 +0000 UTC m=+1131.505397367" lastFinishedPulling="2025-12-11 13:23:38.044679582 +0000 UTC m=+1175.617006019" observedRunningTime="2025-12-11 13:23:39.374855025 +0000 UTC m=+1176.947181462" watchObservedRunningTime="2025-12-11 13:23:39.378210956 +0000 UTC m=+1176.950537393" Dec 11 13:23:39 crc kubenswrapper[4898]: I1211 13:23:39.406443 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25" podStartSLOduration=3.039374946 podStartE2EDuration="48.406426099s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.355256649 +0000 UTC m=+1130.927583086" lastFinishedPulling="2025-12-11 13:23:38.722307802 +0000 UTC m=+1176.294634239" observedRunningTime="2025-12-11 13:23:39.400883689 +0000 UTC m=+1176.973210126" 
watchObservedRunningTime="2025-12-11 13:23:39.406426099 +0000 UTC m=+1176.978752536" Dec 11 13:23:39 crc kubenswrapper[4898]: I1211 13:23:39.424664 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z" podStartSLOduration=3.587994317 podStartE2EDuration="48.424649072s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.945095755 +0000 UTC m=+1131.517422192" lastFinishedPulling="2025-12-11 13:23:38.78175051 +0000 UTC m=+1176.354076947" observedRunningTime="2025-12-11 13:23:39.422217006 +0000 UTC m=+1176.994543453" watchObservedRunningTime="2025-12-11 13:23:39.424649072 +0000 UTC m=+1176.996975509" Dec 11 13:23:39 crc kubenswrapper[4898]: I1211 13:23:39.448271 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pgfnv" podStartSLOduration=3.47722499 podStartE2EDuration="48.44825361s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.883689044 +0000 UTC m=+1131.456015481" lastFinishedPulling="2025-12-11 13:23:38.854717664 +0000 UTC m=+1176.427044101" observedRunningTime="2025-12-11 13:23:39.445958438 +0000 UTC m=+1177.018284885" watchObservedRunningTime="2025-12-11 13:23:39.44825361 +0000 UTC m=+1177.020580047" Dec 11 13:23:40 crc kubenswrapper[4898]: I1211 13:23:40.356618 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw" event={"ID":"5c391a19-7c2d-4838-9269-2c5cd8eea1ad","Type":"ContainerStarted","Data":"11c3c53eb1b9249a0c5ee9f6053c2ada18685c4c08dced01a5cfb7071e5c9374"} Dec 11 13:23:40 crc kubenswrapper[4898]: I1211 13:23:40.357212 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw" Dec 11 13:23:40 crc 
kubenswrapper[4898]: I1211 13:23:40.358684 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z" event={"ID":"ac70ed50-7e53-4bb9-ac63-35e5c0651db5","Type":"ContainerStarted","Data":"9009c1eb850480d3b6c9b12b3d5d8f49cd4846da3cfd1e022314aff62cd2ba1e"} Dec 11 13:23:40 crc kubenswrapper[4898]: I1211 13:23:40.358926 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z" Dec 11 13:23:40 crc kubenswrapper[4898]: I1211 13:23:40.360790 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4" event={"ID":"e87a760e-bf60-4a98-bb37-1f44745e250f","Type":"ContainerStarted","Data":"471e7656338f7e08c22bb2979c12972781fd143a2346583b1f9f85da3f041439"} Dec 11 13:23:40 crc kubenswrapper[4898]: I1211 13:23:40.361315 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dvtzm" Dec 11 13:23:40 crc kubenswrapper[4898]: I1211 13:23:40.377722 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw" podStartSLOduration=3.304093026 podStartE2EDuration="49.377702064s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.041221293 +0000 UTC m=+1130.613547720" lastFinishedPulling="2025-12-11 13:23:39.114830321 +0000 UTC m=+1176.687156758" observedRunningTime="2025-12-11 13:23:40.374097377 +0000 UTC m=+1177.946423814" watchObservedRunningTime="2025-12-11 13:23:40.377702064 +0000 UTC m=+1177.950028501" Dec 11 13:23:40 crc kubenswrapper[4898]: I1211 13:23:40.403003 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z" podStartSLOduration=4.158391928 
podStartE2EDuration="49.402969098s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.875455631 +0000 UTC m=+1131.447782068" lastFinishedPulling="2025-12-11 13:23:39.120032801 +0000 UTC m=+1176.692359238" observedRunningTime="2025-12-11 13:23:40.393978344 +0000 UTC m=+1177.966304791" watchObservedRunningTime="2025-12-11 13:23:40.402969098 +0000 UTC m=+1177.975295525" Dec 11 13:23:40 crc kubenswrapper[4898]: I1211 13:23:40.420325 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4" podStartSLOduration=3.622532571 podStartE2EDuration="49.420301156s" podCreationTimestamp="2025-12-11 13:22:51 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.323738006 +0000 UTC m=+1130.896064443" lastFinishedPulling="2025-12-11 13:23:39.121506591 +0000 UTC m=+1176.693833028" observedRunningTime="2025-12-11 13:23:40.414971402 +0000 UTC m=+1177.987297839" watchObservedRunningTime="2025-12-11 13:23:40.420301156 +0000 UTC m=+1177.992627593" Dec 11 13:23:41 crc kubenswrapper[4898]: I1211 13:23:41.374035 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4" Dec 11 13:23:41 crc kubenswrapper[4898]: I1211 13:23:41.692496 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc" Dec 11 13:23:41 crc kubenswrapper[4898]: I1211 13:23:41.941450 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw" Dec 11 13:23:43 crc kubenswrapper[4898]: E1211 13:23:43.779916 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv" podUID="c7e9c45d-ed03-4e5f-a585-5d1af92727f9" Dec 11 13:23:44 crc kubenswrapper[4898]: I1211 13:23:44.447033 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" Dec 11 13:23:47 crc kubenswrapper[4898]: I1211 13:23:47.972091 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-8nj46" Dec 11 13:23:47 crc kubenswrapper[4898]: I1211 13:23:47.980921 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 13:23:51 crc kubenswrapper[4898]: I1211 13:23:51.708704 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw" Dec 11 13:23:52 crc kubenswrapper[4898]: I1211 13:23:52.094749 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4" Dec 11 13:23:52 crc kubenswrapper[4898]: I1211 13:23:52.148111 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25" Dec 11 13:23:52 crc kubenswrapper[4898]: I1211 13:23:52.239889 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-pgfnv" Dec 11 13:23:52 crc kubenswrapper[4898]: I1211 13:23:52.265872 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-pqc9z" 
Dec 11 13:23:52 crc kubenswrapper[4898]: I1211 13:23:52.326289 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-dvtzm" Dec 11 13:23:52 crc kubenswrapper[4898]: I1211 13:23:52.449113 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95" Dec 11 13:23:52 crc kubenswrapper[4898]: I1211 13:23:52.472563 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb" Dec 11 13:23:52 crc kubenswrapper[4898]: I1211 13:23:52.591084 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z" Dec 11 13:23:57 crc kubenswrapper[4898]: I1211 13:23:57.970308 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv" event={"ID":"c7e9c45d-ed03-4e5f-a585-5d1af92727f9","Type":"ContainerStarted","Data":"48ce5ac57949519b80f8baf0c7010d2f94d9dd000ecd2c67f78f399ddcc6b880"} Dec 11 13:23:58 crc kubenswrapper[4898]: I1211 13:23:57.999991 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-djqgv" podStartSLOduration=2.470341334 podStartE2EDuration="1m5.999956349s" podCreationTimestamp="2025-12-11 13:22:52 +0000 UTC" firstStartedPulling="2025-12-11 13:22:53.954298264 +0000 UTC m=+1131.526624711" lastFinishedPulling="2025-12-11 13:23:57.483913249 +0000 UTC m=+1195.056239726" observedRunningTime="2025-12-11 13:23:57.992273041 +0000 UTC m=+1195.564599518" watchObservedRunningTime="2025-12-11 13:23:57.999956349 +0000 UTC m=+1195.572282836" Dec 11 13:24:04 crc kubenswrapper[4898]: I1211 13:24:04.996344 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:24:04 crc kubenswrapper[4898]: I1211 13:24:04.997129 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.535831 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2fbtv"] Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.538321 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2fbtv" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.542426 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-h6hwp" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.542602 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.542704 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.542820 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.554345 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2fbtv"] Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.646084 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb296\" (UniqueName: 
\"kubernetes.io/projected/bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8-kube-api-access-vb296\") pod \"dnsmasq-dns-675f4bcbfc-2fbtv\" (UID: \"bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2fbtv" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.646282 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8-config\") pod \"dnsmasq-dns-675f4bcbfc-2fbtv\" (UID: \"bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2fbtv" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.669333 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w4lmc"] Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.683258 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.687618 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.687702 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w4lmc"] Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.750379 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8-config\") pod \"dnsmasq-dns-675f4bcbfc-2fbtv\" (UID: \"bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2fbtv" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.750576 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb296\" (UniqueName: \"kubernetes.io/projected/bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8-kube-api-access-vb296\") pod \"dnsmasq-dns-675f4bcbfc-2fbtv\" (UID: 
\"bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2fbtv" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.751283 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8-config\") pod \"dnsmasq-dns-675f4bcbfc-2fbtv\" (UID: \"bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2fbtv" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.775789 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb296\" (UniqueName: \"kubernetes.io/projected/bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8-kube-api-access-vb296\") pod \"dnsmasq-dns-675f4bcbfc-2fbtv\" (UID: \"bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2fbtv" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.851843 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp5hj\" (UniqueName: \"kubernetes.io/projected/61d961de-6da0-4540-b670-6d3ece217387-kube-api-access-bp5hj\") pod \"dnsmasq-dns-78dd6ddcc-w4lmc\" (UID: \"61d961de-6da0-4540-b670-6d3ece217387\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.851911 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61d961de-6da0-4540-b670-6d3ece217387-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-w4lmc\" (UID: \"61d961de-6da0-4540-b670-6d3ece217387\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.851937 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d961de-6da0-4540-b670-6d3ece217387-config\") pod \"dnsmasq-dns-78dd6ddcc-w4lmc\" (UID: 
\"61d961de-6da0-4540-b670-6d3ece217387\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.863473 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2fbtv" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.956701 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp5hj\" (UniqueName: \"kubernetes.io/projected/61d961de-6da0-4540-b670-6d3ece217387-kube-api-access-bp5hj\") pod \"dnsmasq-dns-78dd6ddcc-w4lmc\" (UID: \"61d961de-6da0-4540-b670-6d3ece217387\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.956935 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61d961de-6da0-4540-b670-6d3ece217387-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-w4lmc\" (UID: \"61d961de-6da0-4540-b670-6d3ece217387\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.956970 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d961de-6da0-4540-b670-6d3ece217387-config\") pod \"dnsmasq-dns-78dd6ddcc-w4lmc\" (UID: \"61d961de-6da0-4540-b670-6d3ece217387\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.958072 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d961de-6da0-4540-b670-6d3ece217387-config\") pod \"dnsmasq-dns-78dd6ddcc-w4lmc\" (UID: \"61d961de-6da0-4540-b670-6d3ece217387\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.959160 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/61d961de-6da0-4540-b670-6d3ece217387-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-w4lmc\" (UID: \"61d961de-6da0-4540-b670-6d3ece217387\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" Dec 11 13:24:13 crc kubenswrapper[4898]: I1211 13:24:13.986574 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp5hj\" (UniqueName: \"kubernetes.io/projected/61d961de-6da0-4540-b670-6d3ece217387-kube-api-access-bp5hj\") pod \"dnsmasq-dns-78dd6ddcc-w4lmc\" (UID: \"61d961de-6da0-4540-b670-6d3ece217387\") " pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" Dec 11 13:24:14 crc kubenswrapper[4898]: I1211 13:24:14.013100 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" Dec 11 13:24:14 crc kubenswrapper[4898]: I1211 13:24:14.306070 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2fbtv"] Dec 11 13:24:14 crc kubenswrapper[4898]: W1211 13:24:14.312358 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcb38feb_75c6_4fe1_b522_5bfa0f8e62c8.slice/crio-793223e679b2a3ac6e0898a4a75a4503d31c4a1786361b90f5d9abb18a0e735e WatchSource:0}: Error finding container 793223e679b2a3ac6e0898a4a75a4503d31c4a1786361b90f5d9abb18a0e735e: Status 404 returned error can't find the container with id 793223e679b2a3ac6e0898a4a75a4503d31c4a1786361b90f5d9abb18a0e735e Dec 11 13:24:14 crc kubenswrapper[4898]: I1211 13:24:14.460808 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w4lmc"] Dec 11 13:24:14 crc kubenswrapper[4898]: W1211 13:24:14.461694 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61d961de_6da0_4540_b670_6d3ece217387.slice/crio-250d975641ea3f88f010e75177101e5375adc41028a01ea5721d0f39c77927a5 WatchSource:0}: Error finding container 
250d975641ea3f88f010e75177101e5375adc41028a01ea5721d0f39c77927a5: Status 404 returned error can't find the container with id 250d975641ea3f88f010e75177101e5375adc41028a01ea5721d0f39c77927a5 Dec 11 13:24:15 crc kubenswrapper[4898]: I1211 13:24:15.172965 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" event={"ID":"61d961de-6da0-4540-b670-6d3ece217387","Type":"ContainerStarted","Data":"250d975641ea3f88f010e75177101e5375adc41028a01ea5721d0f39c77927a5"} Dec 11 13:24:15 crc kubenswrapper[4898]: I1211 13:24:15.173830 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2fbtv" event={"ID":"bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8","Type":"ContainerStarted","Data":"793223e679b2a3ac6e0898a4a75a4503d31c4a1786361b90f5d9abb18a0e735e"} Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.640141 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2fbtv"] Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.690192 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6rnmm"] Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.691856 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.707286 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6rnmm"] Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.817829 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp4rc\" (UniqueName: \"kubernetes.io/projected/92279f83-81f1-4e41-8dae-72b3a58335e2-kube-api-access-vp4rc\") pod \"dnsmasq-dns-666b6646f7-6rnmm\" (UID: \"92279f83-81f1-4e41-8dae-72b3a58335e2\") " pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.817900 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92279f83-81f1-4e41-8dae-72b3a58335e2-config\") pod \"dnsmasq-dns-666b6646f7-6rnmm\" (UID: \"92279f83-81f1-4e41-8dae-72b3a58335e2\") " pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.817960 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92279f83-81f1-4e41-8dae-72b3a58335e2-dns-svc\") pod \"dnsmasq-dns-666b6646f7-6rnmm\" (UID: \"92279f83-81f1-4e41-8dae-72b3a58335e2\") " pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.919626 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92279f83-81f1-4e41-8dae-72b3a58335e2-dns-svc\") pod \"dnsmasq-dns-666b6646f7-6rnmm\" (UID: \"92279f83-81f1-4e41-8dae-72b3a58335e2\") " pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.919739 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp4rc\" (UniqueName: 
\"kubernetes.io/projected/92279f83-81f1-4e41-8dae-72b3a58335e2-kube-api-access-vp4rc\") pod \"dnsmasq-dns-666b6646f7-6rnmm\" (UID: \"92279f83-81f1-4e41-8dae-72b3a58335e2\") " pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.919797 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92279f83-81f1-4e41-8dae-72b3a58335e2-config\") pod \"dnsmasq-dns-666b6646f7-6rnmm\" (UID: \"92279f83-81f1-4e41-8dae-72b3a58335e2\") " pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.921083 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92279f83-81f1-4e41-8dae-72b3a58335e2-config\") pod \"dnsmasq-dns-666b6646f7-6rnmm\" (UID: \"92279f83-81f1-4e41-8dae-72b3a58335e2\") " pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.921385 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92279f83-81f1-4e41-8dae-72b3a58335e2-dns-svc\") pod \"dnsmasq-dns-666b6646f7-6rnmm\" (UID: \"92279f83-81f1-4e41-8dae-72b3a58335e2\") " pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.923413 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w4lmc"] Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.940605 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2t4lr"] Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.942360 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.962223 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp4rc\" (UniqueName: \"kubernetes.io/projected/92279f83-81f1-4e41-8dae-72b3a58335e2-kube-api-access-vp4rc\") pod \"dnsmasq-dns-666b6646f7-6rnmm\" (UID: \"92279f83-81f1-4e41-8dae-72b3a58335e2\") " pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" Dec 11 13:24:16 crc kubenswrapper[4898]: I1211 13:24:16.969302 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2t4lr"] Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.026892 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.030515 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a828046-bb9f-4a7a-be64-d18efa6ccb63-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2t4lr\" (UID: \"7a828046-bb9f-4a7a-be64-d18efa6ccb63\") " pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.030565 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a828046-bb9f-4a7a-be64-d18efa6ccb63-config\") pod \"dnsmasq-dns-57d769cc4f-2t4lr\" (UID: \"7a828046-bb9f-4a7a-be64-d18efa6ccb63\") " pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.030653 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67xj9\" (UniqueName: \"kubernetes.io/projected/7a828046-bb9f-4a7a-be64-d18efa6ccb63-kube-api-access-67xj9\") pod \"dnsmasq-dns-57d769cc4f-2t4lr\" (UID: \"7a828046-bb9f-4a7a-be64-d18efa6ccb63\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.132103 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67xj9\" (UniqueName: \"kubernetes.io/projected/7a828046-bb9f-4a7a-be64-d18efa6ccb63-kube-api-access-67xj9\") pod \"dnsmasq-dns-57d769cc4f-2t4lr\" (UID: \"7a828046-bb9f-4a7a-be64-d18efa6ccb63\") " pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.132899 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a828046-bb9f-4a7a-be64-d18efa6ccb63-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2t4lr\" (UID: \"7a828046-bb9f-4a7a-be64-d18efa6ccb63\") " pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.132944 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a828046-bb9f-4a7a-be64-d18efa6ccb63-config\") pod \"dnsmasq-dns-57d769cc4f-2t4lr\" (UID: \"7a828046-bb9f-4a7a-be64-d18efa6ccb63\") " pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.134555 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a828046-bb9f-4a7a-be64-d18efa6ccb63-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2t4lr\" (UID: \"7a828046-bb9f-4a7a-be64-d18efa6ccb63\") " pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.135020 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a828046-bb9f-4a7a-be64-d18efa6ccb63-config\") pod \"dnsmasq-dns-57d769cc4f-2t4lr\" (UID: \"7a828046-bb9f-4a7a-be64-d18efa6ccb63\") " pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.153726 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67xj9\" (UniqueName: \"kubernetes.io/projected/7a828046-bb9f-4a7a-be64-d18efa6ccb63-kube-api-access-67xj9\") pod \"dnsmasq-dns-57d769cc4f-2t4lr\" (UID: \"7a828046-bb9f-4a7a-be64-d18efa6ccb63\") " pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.293591 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.592371 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6rnmm"] Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.802928 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.804580 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.804660 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.849185 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.849213 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-z7sht" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.849528 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.849618 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.849799 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.850011 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.852388 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.954894 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc2fn\" (UniqueName: \"kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-kube-api-access-bc2fn\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.954956 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.954990 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6eaf839e-626a-4f9a-b489-d9c37cee9065-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.955093 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6eaf839e-626a-4f9a-b489-d9c37cee9065-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.955367 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.955469 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-config-data\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.955511 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" 
Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.955542 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.955556 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.955624 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:17 crc kubenswrapper[4898]: I1211 13:24:17.955690 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.057033 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6eaf839e-626a-4f9a-b489-d9c37cee9065-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.057075 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6eaf839e-626a-4f9a-b489-d9c37cee9065-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.057154 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.057193 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-config-data\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.057217 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.057235 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.057252 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.057289 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.057321 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.057340 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc2fn\" (UniqueName: \"kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-kube-api-access-bc2fn\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.057362 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.058319 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 
13:24:18.060934 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-config-data\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.061400 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.061592 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.061732 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.066909 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.068249 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/6eaf839e-626a-4f9a-b489-d9c37cee9065-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.074977 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.077146 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.082330 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.082649 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6eaf839e-626a-4f9a-b489-d9c37cee9065-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.087089 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.087211 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.087372 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.087507 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.087646 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6hkm6" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.088229 4898 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.098133 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.098641 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.099166 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc2fn\" (UniqueName: \"kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-kube-api-access-bc2fn\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.132712 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.152819 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.161284 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68zxq\" (UniqueName: \"kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-kube-api-access-68zxq\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.161362 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.161392 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.161486 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.161551 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.161582 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.161692 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/026f0391-aa61-4b41-963f-239e08b0cd34-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.161729 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.161793 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.161845 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.161933 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/026f0391-aa61-4b41-963f-239e08b0cd34-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.186481 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.264335 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.264400 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.264478 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/026f0391-aa61-4b41-963f-239e08b0cd34-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.264520 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68zxq\" (UniqueName: \"kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-kube-api-access-68zxq\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.264555 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.264582 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.264601 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.264629 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.264654 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.264706 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/026f0391-aa61-4b41-963f-239e08b0cd34-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.264737 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.265230 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.265729 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.265343 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.265276 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 
13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.267490 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.268104 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.269582 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.269652 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/026f0391-aa61-4b41-963f-239e08b0cd34-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.273863 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.286134 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/026f0391-aa61-4b41-963f-239e08b0cd34-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.289095 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68zxq\" (UniqueName: \"kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-kube-api-access-68zxq\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.313005 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:18 crc kubenswrapper[4898]: I1211 13:24:18.487113 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.342245 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.346333 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.348802 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-lwqqm" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.353076 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.353755 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.362730 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.365973 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.368797 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.488075 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2e7cffb6-80f8-45e8-a4ab-219dc834a613-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.488133 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2e7cffb6-80f8-45e8-a4ab-219dc834a613-config-data-default\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.488178 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfgsd\" (UniqueName: \"kubernetes.io/projected/2e7cffb6-80f8-45e8-a4ab-219dc834a613-kube-api-access-sfgsd\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.488269 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2e7cffb6-80f8-45e8-a4ab-219dc834a613-kolla-config\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.488299 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7cffb6-80f8-45e8-a4ab-219dc834a613-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.488332 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.488355 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e7cffb6-80f8-45e8-a4ab-219dc834a613-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.488612 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e7cffb6-80f8-45e8-a4ab-219dc834a613-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.593503 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2e7cffb6-80f8-45e8-a4ab-219dc834a613-kolla-config\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.593571 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7cffb6-80f8-45e8-a4ab-219dc834a613-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.593628 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.593656 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e7cffb6-80f8-45e8-a4ab-219dc834a613-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.593711 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e7cffb6-80f8-45e8-a4ab-219dc834a613-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.593810 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2e7cffb6-80f8-45e8-a4ab-219dc834a613-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.593847 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2e7cffb6-80f8-45e8-a4ab-219dc834a613-config-data-default\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.593882 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfgsd\" (UniqueName: \"kubernetes.io/projected/2e7cffb6-80f8-45e8-a4ab-219dc834a613-kube-api-access-sfgsd\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.594006 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.594794 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2e7cffb6-80f8-45e8-a4ab-219dc834a613-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 
crc kubenswrapper[4898]: I1211 13:24:19.595318 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2e7cffb6-80f8-45e8-a4ab-219dc834a613-kolla-config\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.595391 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2e7cffb6-80f8-45e8-a4ab-219dc834a613-config-data-default\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.595989 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e7cffb6-80f8-45e8-a4ab-219dc834a613-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.616896 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e7cffb6-80f8-45e8-a4ab-219dc834a613-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.620890 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e7cffb6-80f8-45e8-a4ab-219dc834a613-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.625139 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfgsd\" (UniqueName: 
\"kubernetes.io/projected/2e7cffb6-80f8-45e8-a4ab-219dc834a613-kube-api-access-sfgsd\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.654702 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"2e7cffb6-80f8-45e8-a4ab-219dc834a613\") " pod="openstack/openstack-galera-0" Dec 11 13:24:19 crc kubenswrapper[4898]: I1211 13:24:19.695086 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.711399 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.713524 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.717348 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-4g28m" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.717613 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.717841 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.723118 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.738742 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.816873 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ae31191-f9f6-452a-8f45-a48b4736012e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.816970 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5ae31191-f9f6-452a-8f45-a48b4736012e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.817007 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.817060 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5ae31191-f9f6-452a-8f45-a48b4736012e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.817131 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae31191-f9f6-452a-8f45-a48b4736012e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.817175 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ae31191-f9f6-452a-8f45-a48b4736012e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.817221 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjhns\" (UniqueName: \"kubernetes.io/projected/5ae31191-f9f6-452a-8f45-a48b4736012e-kube-api-access-tjhns\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.817258 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5ae31191-f9f6-452a-8f45-a48b4736012e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.919369 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ae31191-f9f6-452a-8f45-a48b4736012e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.919467 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjhns\" (UniqueName: \"kubernetes.io/projected/5ae31191-f9f6-452a-8f45-a48b4736012e-kube-api-access-tjhns\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 
13:24:20.919512 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5ae31191-f9f6-452a-8f45-a48b4736012e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.919574 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ae31191-f9f6-452a-8f45-a48b4736012e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.919623 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5ae31191-f9f6-452a-8f45-a48b4736012e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.919659 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.919709 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5ae31191-f9f6-452a-8f45-a48b4736012e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.919753 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae31191-f9f6-452a-8f45-a48b4736012e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.920255 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.921127 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5ae31191-f9f6-452a-8f45-a48b4736012e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.921765 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5ae31191-f9f6-452a-8f45-a48b4736012e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.921897 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ae31191-f9f6-452a-8f45-a48b4736012e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.922091 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/5ae31191-f9f6-452a-8f45-a48b4736012e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.930449 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ae31191-f9f6-452a-8f45-a48b4736012e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.931962 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae31191-f9f6-452a-8f45-a48b4736012e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.950630 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:20 crc kubenswrapper[4898]: I1211 13:24:20.950993 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjhns\" (UniqueName: \"kubernetes.io/projected/5ae31191-f9f6-452a-8f45-a48b4736012e-kube-api-access-tjhns\") pod \"openstack-cell1-galera-0\" (UID: \"5ae31191-f9f6-452a-8f45-a48b4736012e\") " pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.008050 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.011870 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.014161 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.014355 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-j9j7c" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.014851 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.027443 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.036313 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.122799 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f596cf47-0571-443a-9104-c61109f65d44-kolla-config\") pod \"memcached-0\" (UID: \"f596cf47-0571-443a-9104-c61109f65d44\") " pod="openstack/memcached-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.122849 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f596cf47-0571-443a-9104-c61109f65d44-config-data\") pod \"memcached-0\" (UID: \"f596cf47-0571-443a-9104-c61109f65d44\") " pod="openstack/memcached-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.122895 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f596cf47-0571-443a-9104-c61109f65d44-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f596cf47-0571-443a-9104-c61109f65d44\") " 
pod="openstack/memcached-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.122938 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f596cf47-0571-443a-9104-c61109f65d44-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f596cf47-0571-443a-9104-c61109f65d44\") " pod="openstack/memcached-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.122989 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6j58\" (UniqueName: \"kubernetes.io/projected/f596cf47-0571-443a-9104-c61109f65d44-kube-api-access-z6j58\") pod \"memcached-0\" (UID: \"f596cf47-0571-443a-9104-c61109f65d44\") " pod="openstack/memcached-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.224169 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f596cf47-0571-443a-9104-c61109f65d44-kolla-config\") pod \"memcached-0\" (UID: \"f596cf47-0571-443a-9104-c61109f65d44\") " pod="openstack/memcached-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.224495 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f596cf47-0571-443a-9104-c61109f65d44-config-data\") pod \"memcached-0\" (UID: \"f596cf47-0571-443a-9104-c61109f65d44\") " pod="openstack/memcached-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.224556 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f596cf47-0571-443a-9104-c61109f65d44-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f596cf47-0571-443a-9104-c61109f65d44\") " pod="openstack/memcached-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.224612 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f596cf47-0571-443a-9104-c61109f65d44-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f596cf47-0571-443a-9104-c61109f65d44\") " pod="openstack/memcached-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.224670 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6j58\" (UniqueName: \"kubernetes.io/projected/f596cf47-0571-443a-9104-c61109f65d44-kube-api-access-z6j58\") pod \"memcached-0\" (UID: \"f596cf47-0571-443a-9104-c61109f65d44\") " pod="openstack/memcached-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.224871 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f596cf47-0571-443a-9104-c61109f65d44-kolla-config\") pod \"memcached-0\" (UID: \"f596cf47-0571-443a-9104-c61109f65d44\") " pod="openstack/memcached-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.225149 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f596cf47-0571-443a-9104-c61109f65d44-config-data\") pod \"memcached-0\" (UID: \"f596cf47-0571-443a-9104-c61109f65d44\") " pod="openstack/memcached-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.230971 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f596cf47-0571-443a-9104-c61109f65d44-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f596cf47-0571-443a-9104-c61109f65d44\") " pod="openstack/memcached-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.241771 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f596cf47-0571-443a-9104-c61109f65d44-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f596cf47-0571-443a-9104-c61109f65d44\") " pod="openstack/memcached-0" Dec 11 13:24:21 
crc kubenswrapper[4898]: I1211 13:24:21.249361 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6j58\" (UniqueName: \"kubernetes.io/projected/f596cf47-0571-443a-9104-c61109f65d44-kube-api-access-z6j58\") pod \"memcached-0\" (UID: \"f596cf47-0571-443a-9104-c61109f65d44\") " pod="openstack/memcached-0" Dec 11 13:24:21 crc kubenswrapper[4898]: I1211 13:24:21.327918 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 11 13:24:22 crc kubenswrapper[4898]: I1211 13:24:22.285620 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" event={"ID":"92279f83-81f1-4e41-8dae-72b3a58335e2","Type":"ContainerStarted","Data":"9661052eaa15d370b2be813d179a03ab7e25e83db66cfb2d455dcf8681711e1b"} Dec 11 13:24:22 crc kubenswrapper[4898]: I1211 13:24:22.971159 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 13:24:22 crc kubenswrapper[4898]: I1211 13:24:22.972813 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 11 13:24:22 crc kubenswrapper[4898]: I1211 13:24:22.976281 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-jv72c" Dec 11 13:24:22 crc kubenswrapper[4898]: I1211 13:24:22.987672 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.083018 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shs6l\" (UniqueName: \"kubernetes.io/projected/354dfb1c-30d5-4253-beeb-3e57ca531689-kube-api-access-shs6l\") pod \"kube-state-metrics-0\" (UID: \"354dfb1c-30d5-4253-beeb-3e57ca531689\") " pod="openstack/kube-state-metrics-0" Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.184868 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shs6l\" (UniqueName: \"kubernetes.io/projected/354dfb1c-30d5-4253-beeb-3e57ca531689-kube-api-access-shs6l\") pod \"kube-state-metrics-0\" (UID: \"354dfb1c-30d5-4253-beeb-3e57ca531689\") " pod="openstack/kube-state-metrics-0" Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.217086 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shs6l\" (UniqueName: \"kubernetes.io/projected/354dfb1c-30d5-4253-beeb-3e57ca531689-kube-api-access-shs6l\") pod \"kube-state-metrics-0\" (UID: \"354dfb1c-30d5-4253-beeb-3e57ca531689\") " pod="openstack/kube-state-metrics-0" Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.297828 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.652810 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-wb65t"] Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.654555 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-wb65t" Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.660503 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-9vx7t" Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.660656 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.663083 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-wb65t"] Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.797494 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb657218-de3e-4a7b-8412-cab942943d0a-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-wb65t\" (UID: \"bb657218-de3e-4a7b-8412-cab942943d0a\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-wb65t" Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.797951 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4nlq\" (UniqueName: \"kubernetes.io/projected/bb657218-de3e-4a7b-8412-cab942943d0a-kube-api-access-n4nlq\") pod \"observability-ui-dashboards-7d5fb4cbfb-wb65t\" (UID: \"bb657218-de3e-4a7b-8412-cab942943d0a\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-wb65t" Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 
13:24:23.900118 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4nlq\" (UniqueName: \"kubernetes.io/projected/bb657218-de3e-4a7b-8412-cab942943d0a-kube-api-access-n4nlq\") pod \"observability-ui-dashboards-7d5fb4cbfb-wb65t\" (UID: \"bb657218-de3e-4a7b-8412-cab942943d0a\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-wb65t" Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.900183 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb657218-de3e-4a7b-8412-cab942943d0a-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-wb65t\" (UID: \"bb657218-de3e-4a7b-8412-cab942943d0a\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-wb65t" Dec 11 13:24:23 crc kubenswrapper[4898]: E1211 13:24:23.900378 4898 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Dec 11 13:24:23 crc kubenswrapper[4898]: E1211 13:24:23.900449 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb657218-de3e-4a7b-8412-cab942943d0a-serving-cert podName:bb657218-de3e-4a7b-8412-cab942943d0a nodeName:}" failed. No retries permitted until 2025-12-11 13:24:24.400431546 +0000 UTC m=+1221.972757983 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bb657218-de3e-4a7b-8412-cab942943d0a-serving-cert") pod "observability-ui-dashboards-7d5fb4cbfb-wb65t" (UID: "bb657218-de3e-4a7b-8412-cab942943d0a") : secret "observability-ui-dashboards" not found Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.939396 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4nlq\" (UniqueName: \"kubernetes.io/projected/bb657218-de3e-4a7b-8412-cab942943d0a-kube-api-access-n4nlq\") pod \"observability-ui-dashboards-7d5fb4cbfb-wb65t\" (UID: \"bb657218-de3e-4a7b-8412-cab942943d0a\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-wb65t" Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.955012 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-9bc76884c-z28hg"] Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.962694 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:23 crc kubenswrapper[4898]: I1211 13:24:23.970277 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9bc76884c-z28hg"] Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.104322 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ba352c0-f542-46ac-abcc-c136ddfb67fc-console-serving-cert\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.104373 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vgms\" (UniqueName: \"kubernetes.io/projected/4ba352c0-f542-46ac-abcc-c136ddfb67fc-kube-api-access-4vgms\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.104546 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ba352c0-f542-46ac-abcc-c136ddfb67fc-console-config\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.104633 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ba352c0-f542-46ac-abcc-c136ddfb67fc-service-ca\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.105382 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ba352c0-f542-46ac-abcc-c136ddfb67fc-console-oauth-config\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.105615 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ba352c0-f542-46ac-abcc-c136ddfb67fc-oauth-serving-cert\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.105829 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ba352c0-f542-46ac-abcc-c136ddfb67fc-trusted-ca-bundle\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.207120 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ba352c0-f542-46ac-abcc-c136ddfb67fc-console-serving-cert\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.207343 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vgms\" (UniqueName: \"kubernetes.io/projected/4ba352c0-f542-46ac-abcc-c136ddfb67fc-kube-api-access-4vgms\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 
13:24:24.207363 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ba352c0-f542-46ac-abcc-c136ddfb67fc-console-config\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.207386 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ba352c0-f542-46ac-abcc-c136ddfb67fc-service-ca\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.207431 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ba352c0-f542-46ac-abcc-c136ddfb67fc-console-oauth-config\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.207450 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ba352c0-f542-46ac-abcc-c136ddfb67fc-oauth-serving-cert\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.207555 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ba352c0-f542-46ac-abcc-c136ddfb67fc-trusted-ca-bundle\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.208279 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ba352c0-f542-46ac-abcc-c136ddfb67fc-service-ca\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.208298 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ba352c0-f542-46ac-abcc-c136ddfb67fc-console-config\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.208504 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ba352c0-f542-46ac-abcc-c136ddfb67fc-trusted-ca-bundle\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.208657 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ba352c0-f542-46ac-abcc-c136ddfb67fc-oauth-serving-cert\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.217573 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ba352c0-f542-46ac-abcc-c136ddfb67fc-console-serving-cert\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.217960 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/4ba352c0-f542-46ac-abcc-c136ddfb67fc-console-oauth-config\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.233336 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vgms\" (UniqueName: \"kubernetes.io/projected/4ba352c0-f542-46ac-abcc-c136ddfb67fc-kube-api-access-4vgms\") pod \"console-9bc76884c-z28hg\" (UID: \"4ba352c0-f542-46ac-abcc-c136ddfb67fc\") " pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.287721 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.289825 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.293152 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.293178 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.293660 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.294669 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.296093 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.296770 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-csdtx" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.303659 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.319819 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.414794 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmklw\" (UniqueName: \"kubernetes.io/projected/87df6c08-c7eb-4e54-a329-6343e195c6f3-kube-api-access-dmklw\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.414837 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-config\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.414861 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/87df6c08-c7eb-4e54-a329-6343e195c6f3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.414889 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87df6c08-c7eb-4e54-a329-6343e195c6f3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.414923 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87df6c08-c7eb-4e54-a329-6343e195c6f3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.414948 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.415136 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.415168 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.415211 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb657218-de3e-4a7b-8412-cab942943d0a-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-wb65t\" (UID: \"bb657218-de3e-4a7b-8412-cab942943d0a\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-wb65t" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.421065 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb657218-de3e-4a7b-8412-cab942943d0a-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-wb65t\" (UID: \"bb657218-de3e-4a7b-8412-cab942943d0a\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-wb65t" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.517101 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmklw\" (UniqueName: \"kubernetes.io/projected/87df6c08-c7eb-4e54-a329-6343e195c6f3-kube-api-access-dmklw\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.517140 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-config\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.517160 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/87df6c08-c7eb-4e54-a329-6343e195c6f3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc 
kubenswrapper[4898]: I1211 13:24:24.517181 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87df6c08-c7eb-4e54-a329-6343e195c6f3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.517204 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87df6c08-c7eb-4e54-a329-6343e195c6f3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.517221 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.517324 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.517350 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.518131 4898 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.518870 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/87df6c08-c7eb-4e54-a329-6343e195c6f3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.526402 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87df6c08-c7eb-4e54-a329-6343e195c6f3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.539260 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-config\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.540581 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmklw\" (UniqueName: \"kubernetes.io/projected/87df6c08-c7eb-4e54-a329-6343e195c6f3-kube-api-access-dmklw\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.540871 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/87df6c08-c7eb-4e54-a329-6343e195c6f3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.540994 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.541565 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.579681 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-wb65t" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.593965 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:24:24 crc kubenswrapper[4898]: I1211 13:24:24.613372 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 11 13:24:25 crc kubenswrapper[4898]: I1211 13:24:25.749658 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.135053 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lsxj7"]
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.136167 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.153841 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lsxj7"]
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.157707 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.157896 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-pwsjd"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.157920 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.189875 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-fxk76"]
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.198554 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.212404 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fxk76"]
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.254039 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c503750-f2d3-42e3-84ba-1db55db9228f-var-run-ovn\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.254127 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a6777806-e5a2-4585-bd5a-8ba7f7757c59-var-lib\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.254162 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c503750-f2d3-42e3-84ba-1db55db9228f-scripts\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.254195 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6777806-e5a2-4585-bd5a-8ba7f7757c59-var-run\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.254220 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0c503750-f2d3-42e3-84ba-1db55db9228f-var-run\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.254259 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtj52\" (UniqueName: \"kubernetes.io/projected/0c503750-f2d3-42e3-84ba-1db55db9228f-kube-api-access-gtj52\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.254276 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0c503750-f2d3-42e3-84ba-1db55db9228f-var-log-ovn\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.254307 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6777806-e5a2-4585-bd5a-8ba7f7757c59-scripts\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.254330 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a6777806-e5a2-4585-bd5a-8ba7f7757c59-etc-ovs\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.254346 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq899\" (UniqueName: \"kubernetes.io/projected/a6777806-e5a2-4585-bd5a-8ba7f7757c59-kube-api-access-hq899\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.254371 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c503750-f2d3-42e3-84ba-1db55db9228f-ovn-controller-tls-certs\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.254407 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c503750-f2d3-42e3-84ba-1db55db9228f-combined-ca-bundle\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.254427 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a6777806-e5a2-4585-bd5a-8ba7f7757c59-var-log\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.356269 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtj52\" (UniqueName: \"kubernetes.io/projected/0c503750-f2d3-42e3-84ba-1db55db9228f-kube-api-access-gtj52\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.356312 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0c503750-f2d3-42e3-84ba-1db55db9228f-var-log-ovn\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.356354 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6777806-e5a2-4585-bd5a-8ba7f7757c59-scripts\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.356380 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a6777806-e5a2-4585-bd5a-8ba7f7757c59-etc-ovs\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.356399 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq899\" (UniqueName: \"kubernetes.io/projected/a6777806-e5a2-4585-bd5a-8ba7f7757c59-kube-api-access-hq899\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.356424 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c503750-f2d3-42e3-84ba-1db55db9228f-ovn-controller-tls-certs\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.356480 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c503750-f2d3-42e3-84ba-1db55db9228f-combined-ca-bundle\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.356501 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a6777806-e5a2-4585-bd5a-8ba7f7757c59-var-log\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.356533 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c503750-f2d3-42e3-84ba-1db55db9228f-var-run-ovn\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.356571 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a6777806-e5a2-4585-bd5a-8ba7f7757c59-var-lib\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.356592 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c503750-f2d3-42e3-84ba-1db55db9228f-scripts\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.356622 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6777806-e5a2-4585-bd5a-8ba7f7757c59-var-run\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.356647 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0c503750-f2d3-42e3-84ba-1db55db9228f-var-run\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.357093 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c503750-f2d3-42e3-84ba-1db55db9228f-var-run-ovn\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.357142 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a6777806-e5a2-4585-bd5a-8ba7f7757c59-etc-ovs\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.357162 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0c503750-f2d3-42e3-84ba-1db55db9228f-var-run\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.357197 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6777806-e5a2-4585-bd5a-8ba7f7757c59-var-run\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.357301 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a6777806-e5a2-4585-bd5a-8ba7f7757c59-var-log\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.357440 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0c503750-f2d3-42e3-84ba-1db55db9228f-var-log-ovn\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.358415 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a6777806-e5a2-4585-bd5a-8ba7f7757c59-var-lib\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.358950 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6777806-e5a2-4585-bd5a-8ba7f7757c59-scripts\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.360120 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c503750-f2d3-42e3-84ba-1db55db9228f-scripts\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.361542 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c503750-f2d3-42e3-84ba-1db55db9228f-ovn-controller-tls-certs\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.363004 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c503750-f2d3-42e3-84ba-1db55db9228f-combined-ca-bundle\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.378691 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtj52\" (UniqueName: \"kubernetes.io/projected/0c503750-f2d3-42e3-84ba-1db55db9228f-kube-api-access-gtj52\") pod \"ovn-controller-lsxj7\" (UID: \"0c503750-f2d3-42e3-84ba-1db55db9228f\") " pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.398444 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq899\" (UniqueName: \"kubernetes.io/projected/a6777806-e5a2-4585-bd5a-8ba7f7757c59-kube-api-access-hq899\") pod \"ovn-controller-ovs-fxk76\" (UID: \"a6777806-e5a2-4585-bd5a-8ba7f7757c59\") " pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.461303 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lsxj7"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.532894 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fxk76"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.603282 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.605153 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.607203 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.607313 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.607626 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.607980 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-qxgjs"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.609279 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.616289 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.663387 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f82c55a-6891-4ba6-bcbb-854c918faa92-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.663482 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f82c55a-6891-4ba6-bcbb-854c918faa92-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.663517 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.663778 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mxdw\" (UniqueName: \"kubernetes.io/projected/2f82c55a-6891-4ba6-bcbb-854c918faa92-kube-api-access-4mxdw\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.663866 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f82c55a-6891-4ba6-bcbb-854c918faa92-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.663968 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f82c55a-6891-4ba6-bcbb-854c918faa92-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.664011 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f82c55a-6891-4ba6-bcbb-854c918faa92-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.664151 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f82c55a-6891-4ba6-bcbb-854c918faa92-config\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.765989 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f82c55a-6891-4ba6-bcbb-854c918faa92-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.766046 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f82c55a-6891-4ba6-bcbb-854c918faa92-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.766073 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.766103 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mxdw\" (UniqueName: \"kubernetes.io/projected/2f82c55a-6891-4ba6-bcbb-854c918faa92-kube-api-access-4mxdw\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.766129 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f82c55a-6891-4ba6-bcbb-854c918faa92-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.766169 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f82c55a-6891-4ba6-bcbb-854c918faa92-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.766192 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f82c55a-6891-4ba6-bcbb-854c918faa92-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.766253 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f82c55a-6891-4ba6-bcbb-854c918faa92-config\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.766403 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.767601 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f82c55a-6891-4ba6-bcbb-854c918faa92-config\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.767995 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f82c55a-6891-4ba6-bcbb-854c918faa92-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.768038 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f82c55a-6891-4ba6-bcbb-854c918faa92-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.770598 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f82c55a-6891-4ba6-bcbb-854c918faa92-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.770936 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f82c55a-6891-4ba6-bcbb-854c918faa92-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.782478 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mxdw\" (UniqueName: \"kubernetes.io/projected/2f82c55a-6891-4ba6-bcbb-854c918faa92-kube-api-access-4mxdw\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.782651 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f82c55a-6891-4ba6-bcbb-854c918faa92-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.810218 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2f82c55a-6891-4ba6-bcbb-854c918faa92\") " pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:26 crc kubenswrapper[4898]: I1211 13:24:26.936993 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.500640 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.502917 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.505297 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-s6cjl"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.505949 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.506149 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.506281 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.516604 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.537395 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aab51a6c-5473-415e-913c-dcb5907cd012-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.537449 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aab51a6c-5473-415e-913c-dcb5907cd012-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.537499 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aab51a6c-5473-415e-913c-dcb5907cd012-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.537537 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aab51a6c-5473-415e-913c-dcb5907cd012-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.537577 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.537623 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab51a6c-5473-415e-913c-dcb5907cd012-config\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.537646 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aab51a6c-5473-415e-913c-dcb5907cd012-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.537675 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdmzg\" (UniqueName: \"kubernetes.io/projected/aab51a6c-5473-415e-913c-dcb5907cd012-kube-api-access-pdmzg\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.639224 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdmzg\" (UniqueName: \"kubernetes.io/projected/aab51a6c-5473-415e-913c-dcb5907cd012-kube-api-access-pdmzg\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.639310 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aab51a6c-5473-415e-913c-dcb5907cd012-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.639349 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aab51a6c-5473-415e-913c-dcb5907cd012-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.639385 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aab51a6c-5473-415e-913c-dcb5907cd012-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.639407 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aab51a6c-5473-415e-913c-dcb5907cd012-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.639447 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.639542 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab51a6c-5473-415e-913c-dcb5907cd012-config\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.639568 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aab51a6c-5473-415e-913c-dcb5907cd012-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.639964 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.640794 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab51a6c-5473-415e-913c-dcb5907cd012-config\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.640884 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aab51a6c-5473-415e-913c-dcb5907cd012-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.642164 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aab51a6c-5473-415e-913c-dcb5907cd012-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.646036 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aab51a6c-5473-415e-913c-dcb5907cd012-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.646065 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aab51a6c-5473-415e-913c-dcb5907cd012-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.647811 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aab51a6c-5473-415e-913c-dcb5907cd012-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.656408 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdmzg\" (UniqueName: \"kubernetes.io/projected/aab51a6c-5473-415e-913c-dcb5907cd012-kube-api-access-pdmzg\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.670833 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"aab51a6c-5473-415e-913c-dcb5907cd012\") " pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:30 crc kubenswrapper[4898]: I1211 13:24:30.830047 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Dec 11 13:24:33 crc kubenswrapper[4898]: I1211 13:24:33.403678 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5ae31191-f9f6-452a-8f45-a48b4736012e","Type":"ContainerStarted","Data":"697e07b70ca7b13d917f988eea2922199d44bc04c73a4dddca9c2cc37952207f"}
Dec 11 13:24:33 crc kubenswrapper[4898]: I1211 13:24:33.554528 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 11 13:24:33 crc kubenswrapper[4898]: E1211 13:24:33.961537 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Dec 11 13:24:33 crc kubenswrapper[4898]: E1211 13:24:33.961938 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vb296,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-2fbtv_openstack(bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:24:33 crc kubenswrapper[4898]: E1211 13:24:33.963014 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-2fbtv" podUID="bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8" Dec 11 13:24:33 crc kubenswrapper[4898]: E1211 13:24:33.998125 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 11 13:24:33 crc kubenswrapper[4898]: E1211 13:24:33.999118 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bp5hj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-w4lmc_openstack(61d961de-6da0-4540-b670-6d3ece217387): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:24:34 crc kubenswrapper[4898]: E1211 13:24:34.000341 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" podUID="61d961de-6da0-4540-b670-6d3ece217387" Dec 11 13:24:34 crc kubenswrapper[4898]: I1211 13:24:34.424112 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6eaf839e-626a-4f9a-b489-d9c37cee9065","Type":"ContainerStarted","Data":"c40c2a955382f5d301c518d9e470a4554c1814de3c80decbb2eb1cad292fcaf0"} Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:34.995931 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:34.996295 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" 
podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.227843 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2t4lr"] Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.229278 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2fbtv" Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.240081 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.240371 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 11 13:24:35 crc kubenswrapper[4898]: W1211 13:24:35.258822 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a828046_bb9f_4a7a_be64_d18efa6ccb63.slice/crio-4a36b6dbb86539d1273c5afd15fde10ddc4923de98fddbb747d23996c720749a WatchSource:0}: Error finding container 4a36b6dbb86539d1273c5afd15fde10ddc4923de98fddbb747d23996c720749a: Status 404 returned error can't find the container with id 4a36b6dbb86539d1273c5afd15fde10ddc4923de98fddbb747d23996c720749a Dec 11 13:24:35 crc kubenswrapper[4898]: W1211 13:24:35.272830 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e7cffb6_80f8_45e8_a4ab_219dc834a613.slice/crio-f4e7931ad0bb5bb93c8f851dd31bc81378ab0ddf297fb3ae2efedbf4f24c69bf WatchSource:0}: Error finding container f4e7931ad0bb5bb93c8f851dd31bc81378ab0ddf297fb3ae2efedbf4f24c69bf: Status 404 returned error can't find the container with id f4e7931ad0bb5bb93c8f851dd31bc81378ab0ddf297fb3ae2efedbf4f24c69bf Dec 11 
13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.285588 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 13:24:35 crc kubenswrapper[4898]: W1211 13:24:35.289944 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod026f0391_aa61_4b41_963f_239e08b0cd34.slice/crio-8d647938454b298fe7a9ca7949a3907dfea0a58d81fcccb2a95acc292402e589 WatchSource:0}: Error finding container 8d647938454b298fe7a9ca7949a3907dfea0a58d81fcccb2a95acc292402e589: Status 404 returned error can't find the container with id 8d647938454b298fe7a9ca7949a3907dfea0a58d81fcccb2a95acc292402e589 Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.339013 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8-config\") pod \"bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8\" (UID: \"bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8\") " Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.339210 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb296\" (UniqueName: \"kubernetes.io/projected/bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8-kube-api-access-vb296\") pod \"bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8\" (UID: \"bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8\") " Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.339526 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8-config" (OuterVolumeSpecName: "config") pod "bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8" (UID: "bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.340506 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.344621 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8-kube-api-access-vb296" (OuterVolumeSpecName: "kube-api-access-vb296") pod "bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8" (UID: "bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8"). InnerVolumeSpecName "kube-api-access-vb296". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.396976 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.433098 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" event={"ID":"61d961de-6da0-4540-b670-6d3ece217387","Type":"ContainerDied","Data":"250d975641ea3f88f010e75177101e5375adc41028a01ea5721d0f39c77927a5"} Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.433185 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-w4lmc" Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.434511 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"aab51a6c-5473-415e-913c-dcb5907cd012","Type":"ContainerStarted","Data":"5a214d1015f9a6020f5016b8df48b806f5aa5003addb76b3e9e4ae7c12681380"} Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.440186 4898 generic.go:334] "Generic (PLEG): container finished" podID="92279f83-81f1-4e41-8dae-72b3a58335e2" containerID="3f5b681c2f79dd6f02f8c15a18d383eb3a72142a777c854cefe14afdf342249c" exitCode=0 Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.440342 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" event={"ID":"92279f83-81f1-4e41-8dae-72b3a58335e2","Type":"ContainerDied","Data":"3f5b681c2f79dd6f02f8c15a18d383eb3a72142a777c854cefe14afdf342249c"} Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.441391 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp5hj\" (UniqueName: \"kubernetes.io/projected/61d961de-6da0-4540-b670-6d3ece217387-kube-api-access-bp5hj\") pod \"61d961de-6da0-4540-b670-6d3ece217387\" (UID: \"61d961de-6da0-4540-b670-6d3ece217387\") " Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.441516 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2e7cffb6-80f8-45e8-a4ab-219dc834a613","Type":"ContainerStarted","Data":"f4e7931ad0bb5bb93c8f851dd31bc81378ab0ddf297fb3ae2efedbf4f24c69bf"} Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.441642 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d961de-6da0-4540-b670-6d3ece217387-config\") pod \"61d961de-6da0-4540-b670-6d3ece217387\" (UID: \"61d961de-6da0-4540-b670-6d3ece217387\") " Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 
13:24:35.441811 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61d961de-6da0-4540-b670-6d3ece217387-dns-svc\") pod \"61d961de-6da0-4540-b670-6d3ece217387\" (UID: \"61d961de-6da0-4540-b670-6d3ece217387\") " Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.442338 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb296\" (UniqueName: \"kubernetes.io/projected/bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8-kube-api-access-vb296\") on node \"crc\" DevicePath \"\"" Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.443192 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61d961de-6da0-4540-b670-6d3ece217387-config" (OuterVolumeSpecName: "config") pod "61d961de-6da0-4540-b670-6d3ece217387" (UID: "61d961de-6da0-4540-b670-6d3ece217387"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.444091 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2fbtv" Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.444277 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2fbtv" event={"ID":"bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8","Type":"ContainerDied","Data":"793223e679b2a3ac6e0898a4a75a4503d31c4a1786361b90f5d9abb18a0e735e"} Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.444715 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61d961de-6da0-4540-b670-6d3ece217387-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "61d961de-6da0-4540-b670-6d3ece217387" (UID: "61d961de-6da0-4540-b670-6d3ece217387"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.447671 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61d961de-6da0-4540-b670-6d3ece217387-kube-api-access-bp5hj" (OuterVolumeSpecName: "kube-api-access-bp5hj") pod "61d961de-6da0-4540-b670-6d3ece217387" (UID: "61d961de-6da0-4540-b670-6d3ece217387"). InnerVolumeSpecName "kube-api-access-bp5hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.448866 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" event={"ID":"7a828046-bb9f-4a7a-be64-d18efa6ccb63","Type":"ContainerStarted","Data":"4a36b6dbb86539d1273c5afd15fde10ddc4923de98fddbb747d23996c720749a"} Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.450373 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"026f0391-aa61-4b41-963f-239e08b0cd34","Type":"ContainerStarted","Data":"8d647938454b298fe7a9ca7949a3907dfea0a58d81fcccb2a95acc292402e589"} Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.521989 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2fbtv"] Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.529654 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2fbtv"] Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.544131 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d961de-6da0-4540-b670-6d3ece217387-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.544171 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61d961de-6da0-4540-b670-6d3ece217387-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:24:35 crc 
kubenswrapper[4898]: I1211 13:24:35.544181 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp5hj\" (UniqueName: \"kubernetes.io/projected/61d961de-6da0-4540-b670-6d3ece217387-kube-api-access-bp5hj\") on node \"crc\" DevicePath \"\"" Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.688656 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9bc76884c-z28hg"] Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.709168 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-wb65t"] Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.734267 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.746693 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 11 13:24:35 crc kubenswrapper[4898]: W1211 13:24:35.760816 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod354dfb1c_30d5_4253_beeb_3e57ca531689.slice/crio-022fc477c0f518ee661899d91f54c29bc1ee8e61021d6712aa7ecd99179376ab WatchSource:0}: Error finding container 022fc477c0f518ee661899d91f54c29bc1ee8e61021d6712aa7ecd99179376ab: Status 404 returned error can't find the container with id 022fc477c0f518ee661899d91f54c29bc1ee8e61021d6712aa7ecd99179376ab Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.772509 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.801741 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.822698 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w4lmc"] Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 
13:24:35.834312 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-w4lmc"] Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.853210 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lsxj7"] Dec 11 13:24:35 crc kubenswrapper[4898]: W1211 13:24:35.877872 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf596cf47_0571_443a_9104_c61109f65d44.slice/crio-6fc315b156e3bfcdea5cd8e47533fda57273965d51904ffc4a5e740cf36630ca WatchSource:0}: Error finding container 6fc315b156e3bfcdea5cd8e47533fda57273965d51904ffc4a5e740cf36630ca: Status 404 returned error can't find the container with id 6fc315b156e3bfcdea5cd8e47533fda57273965d51904ffc4a5e740cf36630ca Dec 11 13:24:35 crc kubenswrapper[4898]: I1211 13:24:35.878304 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fxk76"] Dec 11 13:24:35 crc kubenswrapper[4898]: W1211 13:24:35.888026 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87df6c08_c7eb_4e54_a329_6343e195c6f3.slice/crio-2f19d00d2937ed6e40897642286ae713bf3180e3f839fd5ed58fad31930741cc WatchSource:0}: Error finding container 2f19d00d2937ed6e40897642286ae713bf3180e3f839fd5ed58fad31930741cc: Status 404 returned error can't find the container with id 2f19d00d2937ed6e40897642286ae713bf3180e3f839fd5ed58fad31930741cc Dec 11 13:24:35 crc kubenswrapper[4898]: W1211 13:24:35.891175 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f82c55a_6891_4ba6_bcbb_854c918faa92.slice/crio-a0190a536f40d2355b5a84feb50768466ef6f8c6b5fda4c660eb23a44f7e2459 WatchSource:0}: Error finding container a0190a536f40d2355b5a84feb50768466ef6f8c6b5fda4c660eb23a44f7e2459: Status 404 returned error can't find the container with id 
a0190a536f40d2355b5a84feb50768466ef6f8c6b5fda4c660eb23a44f7e2459 Dec 11 13:24:35 crc kubenswrapper[4898]: W1211 13:24:35.901184 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6777806_e5a2_4585_bd5a_8ba7f7757c59.slice/crio-6ac8252df101fb99b108c33785ede64bf03a0cf66430edee9d7a02dcf76808ce WatchSource:0}: Error finding container 6ac8252df101fb99b108c33785ede64bf03a0cf66430edee9d7a02dcf76808ce: Status 404 returned error can't find the container with id 6ac8252df101fb99b108c33785ede64bf03a0cf66430edee9d7a02dcf76808ce Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 13:24:36.458820 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lsxj7" event={"ID":"0c503750-f2d3-42e3-84ba-1db55db9228f","Type":"ContainerStarted","Data":"3a8e576e41d424880b261fe9550b67ff73bcc15705df44b8e0205643c527b56b"} Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 13:24:36.461819 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" event={"ID":"92279f83-81f1-4e41-8dae-72b3a58335e2","Type":"ContainerStarted","Data":"a8ef3bc3231d2b202c81bbc7397ce0109aef75644a1639ee352bc3657b4dd611"} Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 13:24:36.461945 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 13:24:36.466653 4898 generic.go:334] "Generic (PLEG): container finished" podID="7a828046-bb9f-4a7a-be64-d18efa6ccb63" containerID="fcf247872ae4f36b4603111b5c4efd8f5e5a51791692ec12d59f3246e4661e8e" exitCode=0 Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 13:24:36.466696 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" event={"ID":"7a828046-bb9f-4a7a-be64-d18efa6ccb63","Type":"ContainerDied","Data":"fcf247872ae4f36b4603111b5c4efd8f5e5a51791692ec12d59f3246e4661e8e"} Dec 11 13:24:36 crc 
kubenswrapper[4898]: I1211 13:24:36.468879 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fxk76" event={"ID":"a6777806-e5a2-4585-bd5a-8ba7f7757c59","Type":"ContainerStarted","Data":"6ac8252df101fb99b108c33785ede64bf03a0cf66430edee9d7a02dcf76808ce"} Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 13:24:36.470353 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9bc76884c-z28hg" event={"ID":"4ba352c0-f542-46ac-abcc-c136ddfb67fc","Type":"ContainerStarted","Data":"2b862afefdde1f53049b99ed9a00156a984800927d438a9fa0edd9dabb7aebc0"} Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 13:24:36.470393 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9bc76884c-z28hg" event={"ID":"4ba352c0-f542-46ac-abcc-c136ddfb67fc","Type":"ContainerStarted","Data":"36ec9fac55dd7401eb4f1a64dd2b10bb381b1e6a9290604188c2414843e731ab"} Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 13:24:36.472338 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2f82c55a-6891-4ba6-bcbb-854c918faa92","Type":"ContainerStarted","Data":"a0190a536f40d2355b5a84feb50768466ef6f8c6b5fda4c660eb23a44f7e2459"} Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 13:24:36.473402 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"354dfb1c-30d5-4253-beeb-3e57ca531689","Type":"ContainerStarted","Data":"022fc477c0f518ee661899d91f54c29bc1ee8e61021d6712aa7ecd99179376ab"} Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 13:24:36.475100 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-wb65t" event={"ID":"bb657218-de3e-4a7b-8412-cab942943d0a","Type":"ContainerStarted","Data":"9639451d3a8adf5412601e0632e0a28a18cc7370279da5cf96c5092e6f87083a"} Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 13:24:36.476081 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"87df6c08-c7eb-4e54-a329-6343e195c6f3","Type":"ContainerStarted","Data":"2f19d00d2937ed6e40897642286ae713bf3180e3f839fd5ed58fad31930741cc"} Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 13:24:36.476977 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f596cf47-0571-443a-9104-c61109f65d44","Type":"ContainerStarted","Data":"6fc315b156e3bfcdea5cd8e47533fda57273965d51904ffc4a5e740cf36630ca"} Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 13:24:36.485105 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" podStartSLOduration=8.069580383 podStartE2EDuration="20.485086993s" podCreationTimestamp="2025-12-11 13:24:16 +0000 UTC" firstStartedPulling="2025-12-11 13:24:21.710379801 +0000 UTC m=+1219.282706238" lastFinishedPulling="2025-12-11 13:24:34.125886411 +0000 UTC m=+1231.698212848" observedRunningTime="2025-12-11 13:24:36.484927619 +0000 UTC m=+1234.057254066" watchObservedRunningTime="2025-12-11 13:24:36.485086993 +0000 UTC m=+1234.057413430" Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 13:24:36.504566 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9bc76884c-z28hg" podStartSLOduration=13.504547846 podStartE2EDuration="13.504547846s" podCreationTimestamp="2025-12-11 13:24:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:24:36.501253249 +0000 UTC m=+1234.073579686" watchObservedRunningTime="2025-12-11 13:24:36.504547846 +0000 UTC m=+1234.076874283" Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 13:24:36.787106 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61d961de-6da0-4540-b670-6d3ece217387" path="/var/lib/kubelet/pods/61d961de-6da0-4540-b670-6d3ece217387/volumes" Dec 11 13:24:36 crc kubenswrapper[4898]: I1211 
13:24:36.787502 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8" path="/var/lib/kubelet/pods/bcb38feb-75c6-4fe1-b522-5bfa0f8e62c8/volumes" Dec 11 13:24:40 crc kubenswrapper[4898]: I1211 13:24:40.534768 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" event={"ID":"7a828046-bb9f-4a7a-be64-d18efa6ccb63","Type":"ContainerStarted","Data":"aaa2f43d8b651ae4b7c4a21743945fd1180e1bc43b4fd44b200752ab13360407"} Dec 11 13:24:40 crc kubenswrapper[4898]: I1211 13:24:40.535167 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" Dec 11 13:24:40 crc kubenswrapper[4898]: I1211 13:24:40.558927 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" podStartSLOduration=24.558909295 podStartE2EDuration="24.558909295s" podCreationTimestamp="2025-12-11 13:24:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:24:40.551307375 +0000 UTC m=+1238.123633812" watchObservedRunningTime="2025-12-11 13:24:40.558909295 +0000 UTC m=+1238.131235732" Dec 11 13:24:42 crc kubenswrapper[4898]: I1211 13:24:42.029537 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" Dec 11 13:24:44 crc kubenswrapper[4898]: I1211 13:24:44.296647 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:44 crc kubenswrapper[4898]: I1211 13:24:44.297061 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:44 crc kubenswrapper[4898]: I1211 13:24:44.303536 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:44 crc kubenswrapper[4898]: I1211 13:24:44.580891 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9bc76884c-z28hg" Dec 11 13:24:44 crc kubenswrapper[4898]: I1211 13:24:44.685876 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f8f67696b-jczc2"] Dec 11 13:24:47 crc kubenswrapper[4898]: I1211 13:24:47.296422 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" Dec 11 13:24:47 crc kubenswrapper[4898]: I1211 13:24:47.380038 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6rnmm"] Dec 11 13:24:47 crc kubenswrapper[4898]: I1211 13:24:47.380277 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" podUID="92279f83-81f1-4e41-8dae-72b3a58335e2" containerName="dnsmasq-dns" containerID="cri-o://a8ef3bc3231d2b202c81bbc7397ce0109aef75644a1639ee352bc3657b4dd611" gracePeriod=10 Dec 11 13:24:47 crc kubenswrapper[4898]: I1211 13:24:47.609085 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2e7cffb6-80f8-45e8-a4ab-219dc834a613","Type":"ContainerStarted","Data":"8e2da9d79c75ea6186d1e97e4d1453771f0f353467eb2e7a6ba2aeef16fbe926"} Dec 11 13:24:47 crc kubenswrapper[4898]: I1211 13:24:47.619372 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5ae31191-f9f6-452a-8f45-a48b4736012e","Type":"ContainerStarted","Data":"616e99db8f3892fa37bd8fe9d265de41aa023f547fc3a0cbb30fb52b20ea4db3"} Dec 11 13:24:47 crc kubenswrapper[4898]: I1211 13:24:47.622403 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" 
event={"ID":"92279f83-81f1-4e41-8dae-72b3a58335e2","Type":"ContainerDied","Data":"a8ef3bc3231d2b202c81bbc7397ce0109aef75644a1639ee352bc3657b4dd611"} Dec 11 13:24:47 crc kubenswrapper[4898]: I1211 13:24:47.624213 4898 generic.go:334] "Generic (PLEG): container finished" podID="92279f83-81f1-4e41-8dae-72b3a58335e2" containerID="a8ef3bc3231d2b202c81bbc7397ce0109aef75644a1639ee352bc3657b4dd611" exitCode=0 Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.537218 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.636575 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" event={"ID":"92279f83-81f1-4e41-8dae-72b3a58335e2","Type":"ContainerDied","Data":"9661052eaa15d370b2be813d179a03ab7e25e83db66cfb2d455dcf8681711e1b"} Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.636662 4898 scope.go:117] "RemoveContainer" containerID="a8ef3bc3231d2b202c81bbc7397ce0109aef75644a1639ee352bc3657b4dd611" Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.636805 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6rnmm" Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.640085 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6eaf839e-626a-4f9a-b489-d9c37cee9065","Type":"ContainerStarted","Data":"bbc99eb3803c73f2cd3f89521dd3bdaae4c7d877f90cf10b7734bbb008573b50"} Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.642859 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-wb65t" event={"ID":"bb657218-de3e-4a7b-8412-cab942943d0a","Type":"ContainerStarted","Data":"ce1e33fa718e034b3c85c150aa3a444127527e5624c5544d70fc04c931abeee3"} Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.647499 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp4rc\" (UniqueName: \"kubernetes.io/projected/92279f83-81f1-4e41-8dae-72b3a58335e2-kube-api-access-vp4rc\") pod \"92279f83-81f1-4e41-8dae-72b3a58335e2\" (UID: \"92279f83-81f1-4e41-8dae-72b3a58335e2\") " Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.648217 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92279f83-81f1-4e41-8dae-72b3a58335e2-config\") pod \"92279f83-81f1-4e41-8dae-72b3a58335e2\" (UID: \"92279f83-81f1-4e41-8dae-72b3a58335e2\") " Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.648283 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92279f83-81f1-4e41-8dae-72b3a58335e2-dns-svc\") pod \"92279f83-81f1-4e41-8dae-72b3a58335e2\" (UID: \"92279f83-81f1-4e41-8dae-72b3a58335e2\") " Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.705804 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-wb65t" 
podStartSLOduration=15.150862456 podStartE2EDuration="25.705787339s" podCreationTimestamp="2025-12-11 13:24:23 +0000 UTC" firstStartedPulling="2025-12-11 13:24:35.735091463 +0000 UTC m=+1233.307417900" lastFinishedPulling="2025-12-11 13:24:46.290016346 +0000 UTC m=+1243.862342783" observedRunningTime="2025-12-11 13:24:48.682179427 +0000 UTC m=+1246.254505864" watchObservedRunningTime="2025-12-11 13:24:48.705787339 +0000 UTC m=+1246.278113766" Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.714226 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92279f83-81f1-4e41-8dae-72b3a58335e2-kube-api-access-vp4rc" (OuterVolumeSpecName: "kube-api-access-vp4rc") pod "92279f83-81f1-4e41-8dae-72b3a58335e2" (UID: "92279f83-81f1-4e41-8dae-72b3a58335e2"). InnerVolumeSpecName "kube-api-access-vp4rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.753398 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp4rc\" (UniqueName: \"kubernetes.io/projected/92279f83-81f1-4e41-8dae-72b3a58335e2-kube-api-access-vp4rc\") on node \"crc\" DevicePath \"\"" Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.811047 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92279f83-81f1-4e41-8dae-72b3a58335e2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "92279f83-81f1-4e41-8dae-72b3a58335e2" (UID: "92279f83-81f1-4e41-8dae-72b3a58335e2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.832152 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92279f83-81f1-4e41-8dae-72b3a58335e2-config" (OuterVolumeSpecName: "config") pod "92279f83-81f1-4e41-8dae-72b3a58335e2" (UID: "92279f83-81f1-4e41-8dae-72b3a58335e2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.858983 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92279f83-81f1-4e41-8dae-72b3a58335e2-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.859021 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92279f83-81f1-4e41-8dae-72b3a58335e2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.974360 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6rnmm"] Dec 11 13:24:48 crc kubenswrapper[4898]: I1211 13:24:48.982414 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6rnmm"] Dec 11 13:24:49 crc kubenswrapper[4898]: I1211 13:24:49.032635 4898 scope.go:117] "RemoveContainer" containerID="3f5b681c2f79dd6f02f8c15a18d383eb3a72142a777c854cefe14afdf342249c" Dec 11 13:24:49 crc kubenswrapper[4898]: I1211 13:24:49.652266 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"aab51a6c-5473-415e-913c-dcb5907cd012","Type":"ContainerStarted","Data":"02ed77065da0c3b5634ba5540cce718c7e4128e6b16de401d54e6861aad06bc6"} Dec 11 13:24:49 crc kubenswrapper[4898]: I1211 13:24:49.655855 4898 generic.go:334] "Generic (PLEG): container finished" podID="a6777806-e5a2-4585-bd5a-8ba7f7757c59" containerID="02bbc337c207485f4c56000ea7ef660ee4c2835bef8a58143b4794d1083fa8e5" exitCode=0 Dec 11 13:24:49 crc kubenswrapper[4898]: I1211 13:24:49.655915 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fxk76" event={"ID":"a6777806-e5a2-4585-bd5a-8ba7f7757c59","Type":"ContainerDied","Data":"02bbc337c207485f4c56000ea7ef660ee4c2835bef8a58143b4794d1083fa8e5"} Dec 11 13:24:49 crc kubenswrapper[4898]: I1211 13:24:49.657357 
4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2f82c55a-6891-4ba6-bcbb-854c918faa92","Type":"ContainerStarted","Data":"78f510cf25f7d2028ea30354bc8e77262c08e17ec3a9cd504b34b428077a512a"} Dec 11 13:24:49 crc kubenswrapper[4898]: I1211 13:24:49.660797 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"354dfb1c-30d5-4253-beeb-3e57ca531689","Type":"ContainerStarted","Data":"473fa7fe6a537a1b8710634ab2622e077258c24c2ed80cd70ef967c0615240d6"} Dec 11 13:24:49 crc kubenswrapper[4898]: I1211 13:24:49.660875 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 11 13:24:49 crc kubenswrapper[4898]: I1211 13:24:49.662508 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lsxj7" event={"ID":"0c503750-f2d3-42e3-84ba-1db55db9228f","Type":"ContainerStarted","Data":"cfde723d7a508f2b966353fe5ec84c0cc16ad5e120ad48c1f4e2f730eda80e02"} Dec 11 13:24:49 crc kubenswrapper[4898]: I1211 13:24:49.662655 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-lsxj7" Dec 11 13:24:49 crc kubenswrapper[4898]: I1211 13:24:49.665572 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f596cf47-0571-443a-9104-c61109f65d44","Type":"ContainerStarted","Data":"9d911f8f97be152e410239c5c952d71afc426dced61b88d6d365f523a42d1e49"} Dec 11 13:24:49 crc kubenswrapper[4898]: I1211 13:24:49.665605 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 11 13:24:49 crc kubenswrapper[4898]: I1211 13:24:49.724539 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lsxj7" podStartSLOduration=12.520623884 podStartE2EDuration="23.724521965s" podCreationTimestamp="2025-12-11 13:24:26 +0000 UTC" firstStartedPulling="2025-12-11 13:24:35.897884114 
+0000 UTC m=+1233.470210551" lastFinishedPulling="2025-12-11 13:24:47.101782195 +0000 UTC m=+1244.674108632" observedRunningTime="2025-12-11 13:24:49.721752232 +0000 UTC m=+1247.294078669" watchObservedRunningTime="2025-12-11 13:24:49.724521965 +0000 UTC m=+1247.296848412" Dec 11 13:24:49 crc kubenswrapper[4898]: I1211 13:24:49.751027 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.255157407 podStartE2EDuration="29.751008153s" podCreationTimestamp="2025-12-11 13:24:20 +0000 UTC" firstStartedPulling="2025-12-11 13:24:35.890175921 +0000 UTC m=+1233.462502358" lastFinishedPulling="2025-12-11 13:24:46.386026667 +0000 UTC m=+1243.958353104" observedRunningTime="2025-12-11 13:24:49.744792639 +0000 UTC m=+1247.317119086" watchObservedRunningTime="2025-12-11 13:24:49.751008153 +0000 UTC m=+1247.323334600" Dec 11 13:24:49 crc kubenswrapper[4898]: I1211 13:24:49.770355 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.494690567 podStartE2EDuration="27.770335833s" podCreationTimestamp="2025-12-11 13:24:22 +0000 UTC" firstStartedPulling="2025-12-11 13:24:35.841785795 +0000 UTC m=+1233.414112232" lastFinishedPulling="2025-12-11 13:24:49.117431071 +0000 UTC m=+1246.689757498" observedRunningTime="2025-12-11 13:24:49.767903408 +0000 UTC m=+1247.340229835" watchObservedRunningTime="2025-12-11 13:24:49.770335833 +0000 UTC m=+1247.342662270" Dec 11 13:24:50 crc kubenswrapper[4898]: I1211 13:24:50.681133 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"026f0391-aa61-4b41-963f-239e08b0cd34","Type":"ContainerStarted","Data":"9bd52eab2cebdafa7238c3a7545f94428c7ec74df13750e07166556883515e5a"} Dec 11 13:24:50 crc kubenswrapper[4898]: I1211 13:24:50.683795 4898 generic.go:334] "Generic (PLEG): container finished" podID="5ae31191-f9f6-452a-8f45-a48b4736012e" 
containerID="616e99db8f3892fa37bd8fe9d265de41aa023f547fc3a0cbb30fb52b20ea4db3" exitCode=0 Dec 11 13:24:50 crc kubenswrapper[4898]: I1211 13:24:50.683854 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5ae31191-f9f6-452a-8f45-a48b4736012e","Type":"ContainerDied","Data":"616e99db8f3892fa37bd8fe9d265de41aa023f547fc3a0cbb30fb52b20ea4db3"} Dec 11 13:24:50 crc kubenswrapper[4898]: I1211 13:24:50.689161 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"87df6c08-c7eb-4e54-a329-6343e195c6f3","Type":"ContainerStarted","Data":"abac1359be5a82b057a5932b0db735799507827d93dc6b8f2c45349e642e3283"} Dec 11 13:24:50 crc kubenswrapper[4898]: I1211 13:24:50.693943 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fxk76" event={"ID":"a6777806-e5a2-4585-bd5a-8ba7f7757c59","Type":"ContainerStarted","Data":"bc5e211702b6750cc1d38d0ddfad7a189ad64a07f8d583e9b7f72616c344f236"} Dec 11 13:24:50 crc kubenswrapper[4898]: I1211 13:24:50.694072 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fxk76" Dec 11 13:24:50 crc kubenswrapper[4898]: I1211 13:24:50.694174 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fxk76" event={"ID":"a6777806-e5a2-4585-bd5a-8ba7f7757c59","Type":"ContainerStarted","Data":"570e8d137dd383c27ae6f09155fa62a3a186f88cb311944229ca6eea909b6aee"} Dec 11 13:24:50 crc kubenswrapper[4898]: I1211 13:24:50.702287 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fxk76" Dec 11 13:24:50 crc kubenswrapper[4898]: I1211 13:24:50.798537 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-fxk76" podStartSLOduration=14.318504917 podStartE2EDuration="24.798517856s" podCreationTimestamp="2025-12-11 13:24:26 +0000 UTC" 
firstStartedPulling="2025-12-11 13:24:35.905821873 +0000 UTC m=+1233.478148310" lastFinishedPulling="2025-12-11 13:24:46.385834812 +0000 UTC m=+1243.958161249" observedRunningTime="2025-12-11 13:24:50.77742141 +0000 UTC m=+1248.349747847" watchObservedRunningTime="2025-12-11 13:24:50.798517856 +0000 UTC m=+1248.370844293" Dec 11 13:24:50 crc kubenswrapper[4898]: I1211 13:24:50.812563 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92279f83-81f1-4e41-8dae-72b3a58335e2" path="/var/lib/kubelet/pods/92279f83-81f1-4e41-8dae-72b3a58335e2/volumes" Dec 11 13:24:51 crc kubenswrapper[4898]: I1211 13:24:51.712312 4898 generic.go:334] "Generic (PLEG): container finished" podID="2e7cffb6-80f8-45e8-a4ab-219dc834a613" containerID="8e2da9d79c75ea6186d1e97e4d1453771f0f353467eb2e7a6ba2aeef16fbe926" exitCode=0 Dec 11 13:24:51 crc kubenswrapper[4898]: I1211 13:24:51.712395 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2e7cffb6-80f8-45e8-a4ab-219dc834a613","Type":"ContainerDied","Data":"8e2da9d79c75ea6186d1e97e4d1453771f0f353467eb2e7a6ba2aeef16fbe926"} Dec 11 13:24:51 crc kubenswrapper[4898]: I1211 13:24:51.723495 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5ae31191-f9f6-452a-8f45-a48b4736012e","Type":"ContainerStarted","Data":"182cb6777d8dce969ca4cceb8ce2ae318316c8e9eefc71b7a475503bb04dab70"} Dec 11 13:24:51 crc kubenswrapper[4898]: I1211 13:24:51.757637 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.643583198 podStartE2EDuration="32.75761946s" podCreationTimestamp="2025-12-11 13:24:19 +0000 UTC" firstStartedPulling="2025-12-11 13:24:33.142659926 +0000 UTC m=+1230.714986403" lastFinishedPulling="2025-12-11 13:24:39.256696228 +0000 UTC m=+1236.829022665" observedRunningTime="2025-12-11 13:24:51.751408346 +0000 UTC m=+1249.323734783" 
watchObservedRunningTime="2025-12-11 13:24:51.75761946 +0000 UTC m=+1249.329945897" Dec 11 13:24:53 crc kubenswrapper[4898]: I1211 13:24:53.753581 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"aab51a6c-5473-415e-913c-dcb5907cd012","Type":"ContainerStarted","Data":"a64a0f81ebbf3aeee335e12b1427fd19745f66834cf0fb8b651a4a2d5a1e1f67"} Dec 11 13:24:53 crc kubenswrapper[4898]: I1211 13:24:53.757530 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2f82c55a-6891-4ba6-bcbb-854c918faa92","Type":"ContainerStarted","Data":"61b83459ba76cd111734c5eef7063216d2e175eb2611188335f9b9585472d9c3"} Dec 11 13:24:53 crc kubenswrapper[4898]: I1211 13:24:53.761036 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2e7cffb6-80f8-45e8-a4ab-219dc834a613","Type":"ContainerStarted","Data":"7ab646376dff1e213ff8bbf79f0d74175b5dbe8db34e0f510cbcf47d38898a79"} Dec 11 13:24:53 crc kubenswrapper[4898]: I1211 13:24:53.787720 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.595437124 podStartE2EDuration="24.787699267s" podCreationTimestamp="2025-12-11 13:24:29 +0000 UTC" firstStartedPulling="2025-12-11 13:24:35.418081806 +0000 UTC m=+1232.990408243" lastFinishedPulling="2025-12-11 13:24:52.610343949 +0000 UTC m=+1250.182670386" observedRunningTime="2025-12-11 13:24:53.775111865 +0000 UTC m=+1251.347438322" watchObservedRunningTime="2025-12-11 13:24:53.787699267 +0000 UTC m=+1251.360025704" Dec 11 13:24:53 crc kubenswrapper[4898]: I1211 13:24:53.806060 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.577566883 podStartE2EDuration="35.80603898s" podCreationTimestamp="2025-12-11 13:24:18 +0000 UTC" firstStartedPulling="2025-12-11 13:24:35.275099367 +0000 UTC m=+1232.847425814" 
lastFinishedPulling="2025-12-11 13:24:45.503571464 +0000 UTC m=+1243.075897911" observedRunningTime="2025-12-11 13:24:53.796982601 +0000 UTC m=+1251.369309048" watchObservedRunningTime="2025-12-11 13:24:53.80603898 +0000 UTC m=+1251.378365417" Dec 11 13:24:53 crc kubenswrapper[4898]: I1211 13:24:53.824394 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=12.128422163 podStartE2EDuration="28.824375673s" podCreationTimestamp="2025-12-11 13:24:25 +0000 UTC" firstStartedPulling="2025-12-11 13:24:35.899532468 +0000 UTC m=+1233.471858905" lastFinishedPulling="2025-12-11 13:24:52.595485978 +0000 UTC m=+1250.167812415" observedRunningTime="2025-12-11 13:24:53.817649376 +0000 UTC m=+1251.389975823" watchObservedRunningTime="2025-12-11 13:24:53.824375673 +0000 UTC m=+1251.396702120" Dec 11 13:24:53 crc kubenswrapper[4898]: I1211 13:24:53.938525 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 11 13:24:53 crc kubenswrapper[4898]: I1211 13:24:53.980566 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 11 13:24:54 crc kubenswrapper[4898]: I1211 13:24:54.767432 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 11 13:24:54 crc kubenswrapper[4898]: I1211 13:24:54.830840 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 11 13:24:54 crc kubenswrapper[4898]: I1211 13:24:54.832384 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 11 13:24:54 crc kubenswrapper[4898]: I1211 13:24:54.890560 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.032898 4898 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-5bf47b49b7-lm8fl"] Dec 11 13:24:55 crc kubenswrapper[4898]: E1211 13:24:55.033414 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92279f83-81f1-4e41-8dae-72b3a58335e2" containerName="dnsmasq-dns" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.033440 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="92279f83-81f1-4e41-8dae-72b3a58335e2" containerName="dnsmasq-dns" Dec 11 13:24:55 crc kubenswrapper[4898]: E1211 13:24:55.033470 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92279f83-81f1-4e41-8dae-72b3a58335e2" containerName="init" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.033480 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="92279f83-81f1-4e41-8dae-72b3a58335e2" containerName="init" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.033775 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="92279f83-81f1-4e41-8dae-72b3a58335e2" containerName="dnsmasq-dns" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.034996 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.037682 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.057398 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-lm8fl"] Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.078515 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-5wp9b"] Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.079837 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.082015 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.100585 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5wp9b"] Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.191708 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-lm8fl\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.191751 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-lm8fl\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.191789 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlv4b\" (UniqueName: \"kubernetes.io/projected/71e5f4cc-7724-4f78-82c6-ed399925b7b7-kube-api-access-qlv4b\") pod \"dnsmasq-dns-5bf47b49b7-lm8fl\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.191909 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e30fc7-973e-436e-a9df-c839f8609a99-combined-ca-bundle\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") 
" pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.192020 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33e30fc7-973e-436e-a9df-c839f8609a99-config\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.192068 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j744t\" (UniqueName: \"kubernetes.io/projected/33e30fc7-973e-436e-a9df-c839f8609a99-kube-api-access-j744t\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.192089 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-config\") pod \"dnsmasq-dns-5bf47b49b7-lm8fl\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.192138 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/33e30fc7-973e-436e-a9df-c839f8609a99-ovs-rundir\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.192230 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/33e30fc7-973e-436e-a9df-c839f8609a99-ovn-rundir\") pod \"ovn-controller-metrics-5wp9b\" (UID: 
\"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.192311 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33e30fc7-973e-436e-a9df-c839f8609a99-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.293862 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33e30fc7-973e-436e-a9df-c839f8609a99-config\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.293933 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j744t\" (UniqueName: \"kubernetes.io/projected/33e30fc7-973e-436e-a9df-c839f8609a99-kube-api-access-j744t\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.293956 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-config\") pod \"dnsmasq-dns-5bf47b49b7-lm8fl\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.293997 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/33e30fc7-973e-436e-a9df-c839f8609a99-ovs-rundir\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " 
pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.294017 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/33e30fc7-973e-436e-a9df-c839f8609a99-ovn-rundir\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.294037 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33e30fc7-973e-436e-a9df-c839f8609a99-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.294056 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-lm8fl\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.294075 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-lm8fl\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.294097 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlv4b\" (UniqueName: \"kubernetes.io/projected/71e5f4cc-7724-4f78-82c6-ed399925b7b7-kube-api-access-qlv4b\") pod \"dnsmasq-dns-5bf47b49b7-lm8fl\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" 
Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.294132 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e30fc7-973e-436e-a9df-c839f8609a99-combined-ca-bundle\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.294590 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/33e30fc7-973e-436e-a9df-c839f8609a99-ovn-rundir\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.294884 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-config\") pod \"dnsmasq-dns-5bf47b49b7-lm8fl\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.295285 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-lm8fl\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.295435 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-lm8fl\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.295725 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33e30fc7-973e-436e-a9df-c839f8609a99-config\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.302376 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e30fc7-973e-436e-a9df-c839f8609a99-combined-ca-bundle\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.304086 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/33e30fc7-973e-436e-a9df-c839f8609a99-ovs-rundir\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.320189 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j744t\" (UniqueName: \"kubernetes.io/projected/33e30fc7-973e-436e-a9df-c839f8609a99-kube-api-access-j744t\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.325918 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33e30fc7-973e-436e-a9df-c839f8609a99-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5wp9b\" (UID: \"33e30fc7-973e-436e-a9df-c839f8609a99\") " pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.332540 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlv4b\" (UniqueName: 
\"kubernetes.io/projected/71e5f4cc-7724-4f78-82c6-ed399925b7b7-kube-api-access-qlv4b\") pod \"dnsmasq-dns-5bf47b49b7-lm8fl\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.359161 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.382131 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-lm8fl"] Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.409404 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-5wp9b" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.419789 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-6l72c"] Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.422451 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.427970 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.442542 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-6l72c"] Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.599142 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-6l72c\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.599417 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-config\") pod \"dnsmasq-dns-8554648995-6l72c\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.599547 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpmgn\" (UniqueName: \"kubernetes.io/projected/d26c4733-9d89-4666-8554-f769295ad7b3-kube-api-access-bpmgn\") pod \"dnsmasq-dns-8554648995-6l72c\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.599691 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-dns-svc\") pod \"dnsmasq-dns-8554648995-6l72c\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " 
pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.599822 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-6l72c\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.701239 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-dns-svc\") pod \"dnsmasq-dns-8554648995-6l72c\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.701507 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-6l72c\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.701638 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-6l72c\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.701721 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-config\") pod \"dnsmasq-dns-8554648995-6l72c\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc 
kubenswrapper[4898]: I1211 13:24:55.701810 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpmgn\" (UniqueName: \"kubernetes.io/projected/d26c4733-9d89-4666-8554-f769295ad7b3-kube-api-access-bpmgn\") pod \"dnsmasq-dns-8554648995-6l72c\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.702793 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-dns-svc\") pod \"dnsmasq-dns-8554648995-6l72c\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.703321 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-6l72c\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.703328 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-6l72c\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.703644 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-config\") pod \"dnsmasq-dns-8554648995-6l72c\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.719524 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bpmgn\" (UniqueName: \"kubernetes.io/projected/d26c4733-9d89-4666-8554-f769295ad7b3-kube-api-access-bpmgn\") pod \"dnsmasq-dns-8554648995-6l72c\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.789608 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.803475 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:55 crc kubenswrapper[4898]: I1211 13:24:55.859929 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.219512 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.221405 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.230205 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.230401 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-lpbb2" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.230536 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.233650 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5wp9b"] Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.235716 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.331355 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.331851 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.336241 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fcbdad-bb34-4a36-9100-352ddce7c906-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.336595 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvgcw\" (UniqueName: \"kubernetes.io/projected/69fcbdad-bb34-4a36-9100-352ddce7c906-kube-api-access-kvgcw\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc 
kubenswrapper[4898]: I1211 13:24:56.336687 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/69fcbdad-bb34-4a36-9100-352ddce7c906-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.336746 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fcbdad-bb34-4a36-9100-352ddce7c906-config\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.336788 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69fcbdad-bb34-4a36-9100-352ddce7c906-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.336853 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69fcbdad-bb34-4a36-9100-352ddce7c906-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.336956 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69fcbdad-bb34-4a36-9100-352ddce7c906-scripts\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.439083 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69fcbdad-bb34-4a36-9100-352ddce7c906-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.439157 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69fcbdad-bb34-4a36-9100-352ddce7c906-scripts\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.439238 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fcbdad-bb34-4a36-9100-352ddce7c906-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.441398 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvgcw\" (UniqueName: \"kubernetes.io/projected/69fcbdad-bb34-4a36-9100-352ddce7c906-kube-api-access-kvgcw\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.441437 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/69fcbdad-bb34-4a36-9100-352ddce7c906-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.441473 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fcbdad-bb34-4a36-9100-352ddce7c906-config\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 
13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.441490 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69fcbdad-bb34-4a36-9100-352ddce7c906-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.446018 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69fcbdad-bb34-4a36-9100-352ddce7c906-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.446807 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69fcbdad-bb34-4a36-9100-352ddce7c906-scripts\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.448359 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fcbdad-bb34-4a36-9100-352ddce7c906-config\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.461174 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/69fcbdad-bb34-4a36-9100-352ddce7c906-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.470874 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69fcbdad-bb34-4a36-9100-352ddce7c906-metrics-certs-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.474687 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvgcw\" (UniqueName: \"kubernetes.io/projected/69fcbdad-bb34-4a36-9100-352ddce7c906-kube-api-access-kvgcw\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.476229 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fcbdad-bb34-4a36-9100-352ddce7c906-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"69fcbdad-bb34-4a36-9100-352ddce7c906\") " pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.550556 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-lm8fl"] Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.564442 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.790142 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-6l72c"] Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.803239 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5wp9b" event={"ID":"33e30fc7-973e-436e-a9df-c839f8609a99","Type":"ContainerStarted","Data":"80001b2d3f41e69a6587eabe02755eb3752123636f02c51908126a3a94257f6d"} Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.806676 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-6l72c" event={"ID":"d26c4733-9d89-4666-8554-f769295ad7b3","Type":"ContainerStarted","Data":"22e7cbc110c8cecadc12718c48c0b8bcdfc6367feb5e79f51657b84b30cd5b83"} Dec 11 13:24:56 crc kubenswrapper[4898]: I1211 13:24:56.812864 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" event={"ID":"71e5f4cc-7724-4f78-82c6-ed399925b7b7","Type":"ContainerStarted","Data":"3d2d4ad378859d76cfc7d44a891e57e94a740a075f8c2dbd62b1c991300b9404"} Dec 11 13:24:57 crc kubenswrapper[4898]: I1211 13:24:57.053855 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 11 13:24:57 crc kubenswrapper[4898]: I1211 13:24:57.821712 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"69fcbdad-bb34-4a36-9100-352ddce7c906","Type":"ContainerStarted","Data":"8fedb4ed5d6e3c5321287f8ef207261f8f3c1d55d6b2c4b92801c38bf1f1ea6a"} Dec 11 13:24:57 crc kubenswrapper[4898]: I1211 13:24:57.824171 4898 generic.go:334] "Generic (PLEG): container finished" podID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerID="abac1359be5a82b057a5932b0db735799507827d93dc6b8f2c45349e642e3283" exitCode=0 Dec 11 13:24:57 crc kubenswrapper[4898]: I1211 13:24:57.824239 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"87df6c08-c7eb-4e54-a329-6343e195c6f3","Type":"ContainerDied","Data":"abac1359be5a82b057a5932b0db735799507827d93dc6b8f2c45349e642e3283"} Dec 11 13:24:58 crc kubenswrapper[4898]: I1211 13:24:58.849344 4898 generic.go:334] "Generic (PLEG): container finished" podID="d26c4733-9d89-4666-8554-f769295ad7b3" containerID="e8b05aeb810e7971ccaacd84e800c2b24aff185c98cd25917b3884958ac51054" exitCode=0 Dec 11 13:24:58 crc kubenswrapper[4898]: I1211 13:24:58.849554 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-6l72c" event={"ID":"d26c4733-9d89-4666-8554-f769295ad7b3","Type":"ContainerDied","Data":"e8b05aeb810e7971ccaacd84e800c2b24aff185c98cd25917b3884958ac51054"} Dec 11 13:24:58 crc kubenswrapper[4898]: I1211 13:24:58.857919 4898 generic.go:334] "Generic (PLEG): container finished" podID="71e5f4cc-7724-4f78-82c6-ed399925b7b7" containerID="02add320b8014853fcdf2f0b700623870e52193eadcd2a312a35a0c57c467f1f" exitCode=0 Dec 11 13:24:58 crc kubenswrapper[4898]: I1211 13:24:58.857991 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" event={"ID":"71e5f4cc-7724-4f78-82c6-ed399925b7b7","Type":"ContainerDied","Data":"02add320b8014853fcdf2f0b700623870e52193eadcd2a312a35a0c57c467f1f"} Dec 11 13:24:58 crc kubenswrapper[4898]: I1211 13:24:58.873123 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5wp9b" event={"ID":"33e30fc7-973e-436e-a9df-c839f8609a99","Type":"ContainerStarted","Data":"f988ce28dd3842374185d78325ed35b9365905e71c6274bda42d55bcb9eea084"} Dec 11 13:24:58 crc kubenswrapper[4898]: I1211 13:24:58.966691 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-5wp9b" podStartSLOduration=3.9666703009999997 podStartE2EDuration="3.966670301s" podCreationTimestamp="2025-12-11 13:24:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:24:58.96663231 +0000 UTC m=+1256.538958737" watchObservedRunningTime="2025-12-11 13:24:58.966670301 +0000 UTC m=+1256.538996738" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.378028 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.420242 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlv4b\" (UniqueName: \"kubernetes.io/projected/71e5f4cc-7724-4f78-82c6-ed399925b7b7-kube-api-access-qlv4b\") pod \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.420354 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-dns-svc\") pod \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.420490 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-ovsdbserver-nb\") pod \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.420544 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-config\") pod \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\" (UID: \"71e5f4cc-7724-4f78-82c6-ed399925b7b7\") " Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.425627 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/71e5f4cc-7724-4f78-82c6-ed399925b7b7-kube-api-access-qlv4b" (OuterVolumeSpecName: "kube-api-access-qlv4b") pod "71e5f4cc-7724-4f78-82c6-ed399925b7b7" (UID: "71e5f4cc-7724-4f78-82c6-ed399925b7b7"). InnerVolumeSpecName "kube-api-access-qlv4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.457748 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-config" (OuterVolumeSpecName: "config") pod "71e5f4cc-7724-4f78-82c6-ed399925b7b7" (UID: "71e5f4cc-7724-4f78-82c6-ed399925b7b7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.473225 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "71e5f4cc-7724-4f78-82c6-ed399925b7b7" (UID: "71e5f4cc-7724-4f78-82c6-ed399925b7b7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.485878 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "71e5f4cc-7724-4f78-82c6-ed399925b7b7" (UID: "71e5f4cc-7724-4f78-82c6-ed399925b7b7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.524757 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.524816 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.524829 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlv4b\" (UniqueName: \"kubernetes.io/projected/71e5f4cc-7724-4f78-82c6-ed399925b7b7-kube-api-access-qlv4b\") on node \"crc\" DevicePath \"\"" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.524840 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71e5f4cc-7724-4f78-82c6-ed399925b7b7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.695915 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.695961 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.788519 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.885885 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-6l72c" event={"ID":"d26c4733-9d89-4666-8554-f769295ad7b3","Type":"ContainerStarted","Data":"7828a81136b680b4f184b12a4acd807787096a87b5cfa09e7b4fa75e117fbef8"} Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 
13:24:59.886336 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.898900 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" event={"ID":"71e5f4cc-7724-4f78-82c6-ed399925b7b7","Type":"ContainerDied","Data":"3d2d4ad378859d76cfc7d44a891e57e94a740a075f8c2dbd62b1c991300b9404"} Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.898976 4898 scope.go:117] "RemoveContainer" containerID="02add320b8014853fcdf2f0b700623870e52193eadcd2a312a35a0c57c467f1f" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.899143 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-lm8fl" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.908536 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-6l72c" podStartSLOduration=4.90851738 podStartE2EDuration="4.90851738s" podCreationTimestamp="2025-12-11 13:24:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:24:59.904772412 +0000 UTC m=+1257.477098879" watchObservedRunningTime="2025-12-11 13:24:59.90851738 +0000 UTC m=+1257.480843827" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.914884 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"69fcbdad-bb34-4a36-9100-352ddce7c906","Type":"ContainerStarted","Data":"4e292fdcc0e77e02332da2f9b92ec9c90d355a6e833fea0044298f1d297869cd"} Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.914927 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"69fcbdad-bb34-4a36-9100-352ddce7c906","Type":"ContainerStarted","Data":"94adc9758b95918168355a83012cb21361feede83c0e59cacdd2b4f05280453f"} Dec 11 13:24:59 crc 
kubenswrapper[4898]: I1211 13:24:59.915203 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 11 13:24:59 crc kubenswrapper[4898]: I1211 13:24:59.948737 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.304937088 podStartE2EDuration="3.94871737s" podCreationTimestamp="2025-12-11 13:24:56 +0000 UTC" firstStartedPulling="2025-12-11 13:24:57.054109294 +0000 UTC m=+1254.626435731" lastFinishedPulling="2025-12-11 13:24:58.697889586 +0000 UTC m=+1256.270216013" observedRunningTime="2025-12-11 13:24:59.941025057 +0000 UTC m=+1257.513351514" watchObservedRunningTime="2025-12-11 13:24:59.94871737 +0000 UTC m=+1257.521043807" Dec 11 13:25:00 crc kubenswrapper[4898]: I1211 13:25:00.019186 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-lm8fl"] Dec 11 13:25:00 crc kubenswrapper[4898]: I1211 13:25:00.026247 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-lm8fl"] Dec 11 13:25:00 crc kubenswrapper[4898]: I1211 13:25:00.026899 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 11 13:25:00 crc kubenswrapper[4898]: I1211 13:25:00.791635 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71e5f4cc-7724-4f78-82c6-ed399925b7b7" path="/var/lib/kubelet/pods/71e5f4cc-7724-4f78-82c6-ed399925b7b7/volumes" Dec 11 13:25:00 crc kubenswrapper[4898]: I1211 13:25:00.975210 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-267e-account-create-update-rkrvq"] Dec 11 13:25:00 crc kubenswrapper[4898]: E1211 13:25:00.976022 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71e5f4cc-7724-4f78-82c6-ed399925b7b7" containerName="init" Dec 11 13:25:00 crc kubenswrapper[4898]: I1211 13:25:00.976035 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="71e5f4cc-7724-4f78-82c6-ed399925b7b7" containerName="init" Dec 11 13:25:00 crc kubenswrapper[4898]: I1211 13:25:00.976345 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="71e5f4cc-7724-4f78-82c6-ed399925b7b7" containerName="init" Dec 11 13:25:00 crc kubenswrapper[4898]: I1211 13:25:00.977168 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-267e-account-create-update-rkrvq" Dec 11 13:25:00 crc kubenswrapper[4898]: I1211 13:25:00.980039 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 11 13:25:00 crc kubenswrapper[4898]: I1211 13:25:00.984538 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-267e-account-create-update-rkrvq"] Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.035700 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-pddzq"] Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.037097 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pddzq" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.037628 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.038779 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.057691 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pddzq"] Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.135892 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.172698 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjjr5\" (UniqueName: \"kubernetes.io/projected/399ee20b-e403-4d88-bd87-77a1c8b71e93-kube-api-access-zjjr5\") pod \"keystone-db-create-pddzq\" (UID: \"399ee20b-e403-4d88-bd87-77a1c8b71e93\") " pod="openstack/keystone-db-create-pddzq" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.172799 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/399ee20b-e403-4d88-bd87-77a1c8b71e93-operator-scripts\") pod \"keystone-db-create-pddzq\" (UID: \"399ee20b-e403-4d88-bd87-77a1c8b71e93\") " pod="openstack/keystone-db-create-pddzq" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.172862 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04e19d4f-0d50-4924-bd7d-812d753d76ac-operator-scripts\") pod \"keystone-267e-account-create-update-rkrvq\" (UID: \"04e19d4f-0d50-4924-bd7d-812d753d76ac\") " 
pod="openstack/keystone-267e-account-create-update-rkrvq" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.173017 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jczn8\" (UniqueName: \"kubernetes.io/projected/04e19d4f-0d50-4924-bd7d-812d753d76ac-kube-api-access-jczn8\") pod \"keystone-267e-account-create-update-rkrvq\" (UID: \"04e19d4f-0d50-4924-bd7d-812d753d76ac\") " pod="openstack/keystone-267e-account-create-update-rkrvq" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.222031 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-mj9zd"] Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.223208 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mj9zd" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.242408 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-mj9zd"] Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.276162 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04e19d4f-0d50-4924-bd7d-812d753d76ac-operator-scripts\") pod \"keystone-267e-account-create-update-rkrvq\" (UID: \"04e19d4f-0d50-4924-bd7d-812d753d76ac\") " pod="openstack/keystone-267e-account-create-update-rkrvq" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.276232 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jczn8\" (UniqueName: \"kubernetes.io/projected/04e19d4f-0d50-4924-bd7d-812d753d76ac-kube-api-access-jczn8\") pod \"keystone-267e-account-create-update-rkrvq\" (UID: \"04e19d4f-0d50-4924-bd7d-812d753d76ac\") " pod="openstack/keystone-267e-account-create-update-rkrvq" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.277266 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zjjr5\" (UniqueName: \"kubernetes.io/projected/399ee20b-e403-4d88-bd87-77a1c8b71e93-kube-api-access-zjjr5\") pod \"keystone-db-create-pddzq\" (UID: \"399ee20b-e403-4d88-bd87-77a1c8b71e93\") " pod="openstack/keystone-db-create-pddzq" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.277420 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/399ee20b-e403-4d88-bd87-77a1c8b71e93-operator-scripts\") pod \"keystone-db-create-pddzq\" (UID: \"399ee20b-e403-4d88-bd87-77a1c8b71e93\") " pod="openstack/keystone-db-create-pddzq" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.277769 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04e19d4f-0d50-4924-bd7d-812d753d76ac-operator-scripts\") pod \"keystone-267e-account-create-update-rkrvq\" (UID: \"04e19d4f-0d50-4924-bd7d-812d753d76ac\") " pod="openstack/keystone-267e-account-create-update-rkrvq" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.278079 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/399ee20b-e403-4d88-bd87-77a1c8b71e93-operator-scripts\") pod \"keystone-db-create-pddzq\" (UID: \"399ee20b-e403-4d88-bd87-77a1c8b71e93\") " pod="openstack/keystone-db-create-pddzq" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.332561 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jczn8\" (UniqueName: \"kubernetes.io/projected/04e19d4f-0d50-4924-bd7d-812d753d76ac-kube-api-access-jczn8\") pod \"keystone-267e-account-create-update-rkrvq\" (UID: \"04e19d4f-0d50-4924-bd7d-812d753d76ac\") " pod="openstack/keystone-267e-account-create-update-rkrvq" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.345842 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zjjr5\" (UniqueName: \"kubernetes.io/projected/399ee20b-e403-4d88-bd87-77a1c8b71e93-kube-api-access-zjjr5\") pod \"keystone-db-create-pddzq\" (UID: \"399ee20b-e403-4d88-bd87-77a1c8b71e93\") " pod="openstack/keystone-db-create-pddzq" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.355213 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-95d5-account-create-update-62mxt"] Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.355680 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pddzq" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.356856 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-95d5-account-create-update-62mxt" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.363589 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.371353 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-95d5-account-create-update-62mxt"] Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.383632 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d589c38c-afcf-4107-bf16-ac57d302576e-operator-scripts\") pod \"placement-db-create-mj9zd\" (UID: \"d589c38c-afcf-4107-bf16-ac57d302576e\") " pod="openstack/placement-db-create-mj9zd" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.383837 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75p8t\" (UniqueName: \"kubernetes.io/projected/d589c38c-afcf-4107-bf16-ac57d302576e-kube-api-access-75p8t\") pod \"placement-db-create-mj9zd\" (UID: \"d589c38c-afcf-4107-bf16-ac57d302576e\") " pod="openstack/placement-db-create-mj9zd" Dec 11 13:25:01 crc 
kubenswrapper[4898]: I1211 13:25:01.493432 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/376bd2c9-d19e-4322-8269-847515a788cb-operator-scripts\") pod \"placement-95d5-account-create-update-62mxt\" (UID: \"376bd2c9-d19e-4322-8269-847515a788cb\") " pod="openstack/placement-95d5-account-create-update-62mxt" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.494183 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d589c38c-afcf-4107-bf16-ac57d302576e-operator-scripts\") pod \"placement-db-create-mj9zd\" (UID: \"d589c38c-afcf-4107-bf16-ac57d302576e\") " pod="openstack/placement-db-create-mj9zd" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.494402 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75p8t\" (UniqueName: \"kubernetes.io/projected/d589c38c-afcf-4107-bf16-ac57d302576e-kube-api-access-75p8t\") pod \"placement-db-create-mj9zd\" (UID: \"d589c38c-afcf-4107-bf16-ac57d302576e\") " pod="openstack/placement-db-create-mj9zd" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.494448 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nglq\" (UniqueName: \"kubernetes.io/projected/376bd2c9-d19e-4322-8269-847515a788cb-kube-api-access-6nglq\") pod \"placement-95d5-account-create-update-62mxt\" (UID: \"376bd2c9-d19e-4322-8269-847515a788cb\") " pod="openstack/placement-95d5-account-create-update-62mxt" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.495032 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d589c38c-afcf-4107-bf16-ac57d302576e-operator-scripts\") pod \"placement-db-create-mj9zd\" (UID: \"d589c38c-afcf-4107-bf16-ac57d302576e\") " 
pod="openstack/placement-db-create-mj9zd" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.518169 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75p8t\" (UniqueName: \"kubernetes.io/projected/d589c38c-afcf-4107-bf16-ac57d302576e-kube-api-access-75p8t\") pod \"placement-db-create-mj9zd\" (UID: \"d589c38c-afcf-4107-bf16-ac57d302576e\") " pod="openstack/placement-db-create-mj9zd" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.556414 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mj9zd" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.598282 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nglq\" (UniqueName: \"kubernetes.io/projected/376bd2c9-d19e-4322-8269-847515a788cb-kube-api-access-6nglq\") pod \"placement-95d5-account-create-update-62mxt\" (UID: \"376bd2c9-d19e-4322-8269-847515a788cb\") " pod="openstack/placement-95d5-account-create-update-62mxt" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.598413 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/376bd2c9-d19e-4322-8269-847515a788cb-operator-scripts\") pod \"placement-95d5-account-create-update-62mxt\" (UID: \"376bd2c9-d19e-4322-8269-847515a788cb\") " pod="openstack/placement-95d5-account-create-update-62mxt" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.599107 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-267e-account-create-update-rkrvq" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.604250 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/376bd2c9-d19e-4322-8269-847515a788cb-operator-scripts\") pod \"placement-95d5-account-create-update-62mxt\" (UID: \"376bd2c9-d19e-4322-8269-847515a788cb\") " pod="openstack/placement-95d5-account-create-update-62mxt" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.615651 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nglq\" (UniqueName: \"kubernetes.io/projected/376bd2c9-d19e-4322-8269-847515a788cb-kube-api-access-6nglq\") pod \"placement-95d5-account-create-update-62mxt\" (UID: \"376bd2c9-d19e-4322-8269-847515a788cb\") " pod="openstack/placement-95d5-account-create-update-62mxt" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.781053 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-95d5-account-create-update-62mxt" Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.852994 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pddzq"] Dec 11 13:25:01 crc kubenswrapper[4898]: W1211 13:25:01.854429 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod399ee20b_e403_4d88_bd87_77a1c8b71e93.slice/crio-780288fb3ce668c2ed9675211afe42d9244a274570cb8d96d1a08e5c50cba32e WatchSource:0}: Error finding container 780288fb3ce668c2ed9675211afe42d9244a274570cb8d96d1a08e5c50cba32e: Status 404 returned error can't find the container with id 780288fb3ce668c2ed9675211afe42d9244a274570cb8d96d1a08e5c50cba32e Dec 11 13:25:01 crc kubenswrapper[4898]: I1211 13:25:01.982433 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pddzq" event={"ID":"399ee20b-e403-4d88-bd87-77a1c8b71e93","Type":"ContainerStarted","Data":"780288fb3ce668c2ed9675211afe42d9244a274570cb8d96d1a08e5c50cba32e"} Dec 11 13:25:02 crc kubenswrapper[4898]: I1211 13:25:02.104312 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-267e-account-create-update-rkrvq"] Dec 11 13:25:02 crc kubenswrapper[4898]: I1211 13:25:02.114672 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-mj9zd"] Dec 11 13:25:02 crc kubenswrapper[4898]: I1211 13:25:02.260094 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 11 13:25:02 crc kubenswrapper[4898]: I1211 13:25:02.349269 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-95d5-account-create-update-62mxt"] Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.013866 4898 generic.go:334] "Generic (PLEG): container finished" podID="399ee20b-e403-4d88-bd87-77a1c8b71e93" 
containerID="e91e97d01c180d6d046e785fd4f812474b772dffb8fc3766a01e98017b145bee" exitCode=0 Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.014046 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pddzq" event={"ID":"399ee20b-e403-4d88-bd87-77a1c8b71e93","Type":"ContainerDied","Data":"e91e97d01c180d6d046e785fd4f812474b772dffb8fc3766a01e98017b145bee"} Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.020530 4898 generic.go:334] "Generic (PLEG): container finished" podID="d589c38c-afcf-4107-bf16-ac57d302576e" containerID="d628a291c281d94a14bfc2008ec184b1b172c8e4bf9783e218fe8826f72dca34" exitCode=0 Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.020662 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mj9zd" event={"ID":"d589c38c-afcf-4107-bf16-ac57d302576e","Type":"ContainerDied","Data":"d628a291c281d94a14bfc2008ec184b1b172c8e4bf9783e218fe8826f72dca34"} Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.020701 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mj9zd" event={"ID":"d589c38c-afcf-4107-bf16-ac57d302576e","Type":"ContainerStarted","Data":"abec92ee8d976d8bfbd1723bfd8aa462876afb8a5d5e6a5a5643a32497c300e0"} Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.023968 4898 generic.go:334] "Generic (PLEG): container finished" podID="376bd2c9-d19e-4322-8269-847515a788cb" containerID="fa755ac0fd9e6a647310ce8425f2d331da86e2399db53aa213fbc7e0c23ae838" exitCode=0 Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.024077 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-95d5-account-create-update-62mxt" event={"ID":"376bd2c9-d19e-4322-8269-847515a788cb","Type":"ContainerDied","Data":"fa755ac0fd9e6a647310ce8425f2d331da86e2399db53aa213fbc7e0c23ae838"} Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.024285 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-95d5-account-create-update-62mxt" event={"ID":"376bd2c9-d19e-4322-8269-847515a788cb","Type":"ContainerStarted","Data":"a46ee95214c18d74a88e4ba2ee7cd18d649900cf550ef4ccbb552f217c1af0dd"} Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.026418 4898 generic.go:334] "Generic (PLEG): container finished" podID="04e19d4f-0d50-4924-bd7d-812d753d76ac" containerID="1a799b1cdd93058623481fb9cd2fa7b7de26c460b06cc8ba6b0a53d030d6a5fe" exitCode=0 Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.026486 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-267e-account-create-update-rkrvq" event={"ID":"04e19d4f-0d50-4924-bd7d-812d753d76ac","Type":"ContainerDied","Data":"1a799b1cdd93058623481fb9cd2fa7b7de26c460b06cc8ba6b0a53d030d6a5fe"} Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.026519 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-267e-account-create-update-rkrvq" event={"ID":"04e19d4f-0d50-4924-bd7d-812d753d76ac","Type":"ContainerStarted","Data":"fdecbca737a9e141765bad4ae02e5665b7455264912ed8fd01474674883d4923"} Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.112569 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-2wggg"] Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.114346 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-2wggg" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.121871 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-2wggg"] Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.233522 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b41e46-2b51-441b-b181-ab36339a8d19-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-2wggg\" (UID: \"b7b41e46-2b51-441b-b181-ab36339a8d19\") " pod="openstack/mysqld-exporter-openstack-db-create-2wggg" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.233674 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfcdb\" (UniqueName: \"kubernetes.io/projected/b7b41e46-2b51-441b-b181-ab36339a8d19-kube-api-access-rfcdb\") pod \"mysqld-exporter-openstack-db-create-2wggg\" (UID: \"b7b41e46-2b51-441b-b181-ab36339a8d19\") " pod="openstack/mysqld-exporter-openstack-db-create-2wggg" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.311795 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-2649-account-create-update-xf8ww"] Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.313238 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-2649-account-create-update-xf8ww" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.316429 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.328963 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.340800 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-2649-account-create-update-xf8ww"] Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.349027 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4d47\" (UniqueName: \"kubernetes.io/projected/b83689fb-d5d6-4c88-8976-791b41ff048f-kube-api-access-v4d47\") pod \"mysqld-exporter-2649-account-create-update-xf8ww\" (UID: \"b83689fb-d5d6-4c88-8976-791b41ff048f\") " pod="openstack/mysqld-exporter-2649-account-create-update-xf8ww" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.349572 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b41e46-2b51-441b-b181-ab36339a8d19-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-2wggg\" (UID: \"b7b41e46-2b51-441b-b181-ab36339a8d19\") " pod="openstack/mysqld-exporter-openstack-db-create-2wggg" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.349811 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83689fb-d5d6-4c88-8976-791b41ff048f-operator-scripts\") pod \"mysqld-exporter-2649-account-create-update-xf8ww\" (UID: \"b83689fb-d5d6-4c88-8976-791b41ff048f\") " pod="openstack/mysqld-exporter-2649-account-create-update-xf8ww" Dec 11 13:25:03 crc 
kubenswrapper[4898]: I1211 13:25:03.350044 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfcdb\" (UniqueName: \"kubernetes.io/projected/b7b41e46-2b51-441b-b181-ab36339a8d19-kube-api-access-rfcdb\") pod \"mysqld-exporter-openstack-db-create-2wggg\" (UID: \"b7b41e46-2b51-441b-b181-ab36339a8d19\") " pod="openstack/mysqld-exporter-openstack-db-create-2wggg" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.364273 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b41e46-2b51-441b-b181-ab36339a8d19-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-2wggg\" (UID: \"b7b41e46-2b51-441b-b181-ab36339a8d19\") " pod="openstack/mysqld-exporter-openstack-db-create-2wggg" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.400140 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-6l72c"] Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.400396 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfcdb\" (UniqueName: \"kubernetes.io/projected/b7b41e46-2b51-441b-b181-ab36339a8d19-kube-api-access-rfcdb\") pod \"mysqld-exporter-openstack-db-create-2wggg\" (UID: \"b7b41e46-2b51-441b-b181-ab36339a8d19\") " pod="openstack/mysqld-exporter-openstack-db-create-2wggg" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.400420 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-6l72c" podUID="d26c4733-9d89-4666-8554-f769295ad7b3" containerName="dnsmasq-dns" containerID="cri-o://7828a81136b680b4f184b12a4acd807787096a87b5cfa09e7b4fa75e117fbef8" gracePeriod=10 Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.435960 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-2wggg" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.453178 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-kljnd"] Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.454499 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4d47\" (UniqueName: \"kubernetes.io/projected/b83689fb-d5d6-4c88-8976-791b41ff048f-kube-api-access-v4d47\") pod \"mysqld-exporter-2649-account-create-update-xf8ww\" (UID: \"b83689fb-d5d6-4c88-8976-791b41ff048f\") " pod="openstack/mysqld-exporter-2649-account-create-update-xf8ww" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.454626 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83689fb-d5d6-4c88-8976-791b41ff048f-operator-scripts\") pod \"mysqld-exporter-2649-account-create-update-xf8ww\" (UID: \"b83689fb-d5d6-4c88-8976-791b41ff048f\") " pod="openstack/mysqld-exporter-2649-account-create-update-xf8ww" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.455258 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.455534 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83689fb-d5d6-4c88-8976-791b41ff048f-operator-scripts\") pod \"mysqld-exporter-2649-account-create-update-xf8ww\" (UID: \"b83689fb-d5d6-4c88-8976-791b41ff048f\") " pod="openstack/mysqld-exporter-2649-account-create-update-xf8ww" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.477008 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-kljnd"] Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.501302 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4d47\" (UniqueName: \"kubernetes.io/projected/b83689fb-d5d6-4c88-8976-791b41ff048f-kube-api-access-v4d47\") pod \"mysqld-exporter-2649-account-create-update-xf8ww\" (UID: \"b83689fb-d5d6-4c88-8976-791b41ff048f\") " pod="openstack/mysqld-exporter-2649-account-create-update-xf8ww" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.556141 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-config\") pod \"dnsmasq-dns-b8fbc5445-kljnd\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.557575 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-kljnd\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.559562 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6kn7\" (UniqueName: \"kubernetes.io/projected/75e494bf-b288-45e2-8f87-7c146a9bb74f-kube-api-access-t6kn7\") pod \"dnsmasq-dns-b8fbc5445-kljnd\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.559731 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-kljnd\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.559818 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-kljnd\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.653279 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-2649-account-create-update-xf8ww" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.666418 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6kn7\" (UniqueName: \"kubernetes.io/projected/75e494bf-b288-45e2-8f87-7c146a9bb74f-kube-api-access-t6kn7\") pod \"dnsmasq-dns-b8fbc5445-kljnd\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.666483 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-kljnd\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.666517 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-kljnd\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.666709 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-config\") pod \"dnsmasq-dns-b8fbc5445-kljnd\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.666761 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-kljnd\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.667872 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-kljnd\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.668031 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-kljnd\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.668292 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-kljnd\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.668492 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-config\") pod \"dnsmasq-dns-b8fbc5445-kljnd\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.687395 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6kn7\" (UniqueName: \"kubernetes.io/projected/75e494bf-b288-45e2-8f87-7c146a9bb74f-kube-api-access-t6kn7\") pod \"dnsmasq-dns-b8fbc5445-kljnd\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:03 crc kubenswrapper[4898]: I1211 13:25:03.831525 4898 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.041385 4898 generic.go:334] "Generic (PLEG): container finished" podID="d26c4733-9d89-4666-8554-f769295ad7b3" containerID="7828a81136b680b4f184b12a4acd807787096a87b5cfa09e7b4fa75e117fbef8" exitCode=0 Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.041516 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-6l72c" event={"ID":"d26c4733-9d89-4666-8554-f769295ad7b3","Type":"ContainerDied","Data":"7828a81136b680b4f184b12a4acd807787096a87b5cfa09e7b4fa75e117fbef8"} Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.510187 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.526376 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.541654 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.548258 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.549151 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.551039 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-cwtjr" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.584716 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.710302 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" 
(UniqueName: \"kubernetes.io/empty-dir/717e7951-d95b-497f-b2b7-3ec4ef755642-lock\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.710343 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/717e7951-d95b-497f-b2b7-3ec4ef755642-cache\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.710500 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.710535 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.710554 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kv4z\" (UniqueName: \"kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-kube-api-access-5kv4z\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.816645 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/717e7951-d95b-497f-b2b7-3ec4ef755642-lock\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " 
pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.816687 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/717e7951-d95b-497f-b2b7-3ec4ef755642-cache\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.816796 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.816831 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.816851 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kv4z\" (UniqueName: \"kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-kube-api-access-5kv4z\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.817562 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/717e7951-d95b-497f-b2b7-3ec4ef755642-lock\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.817809 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/717e7951-d95b-497f-b2b7-3ec4ef755642-cache\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: E1211 13:25:04.817891 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 11 13:25:04 crc kubenswrapper[4898]: E1211 13:25:04.817911 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 11 13:25:04 crc kubenswrapper[4898]: E1211 13:25:04.817949 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift podName:717e7951-d95b-497f-b2b7-3ec4ef755642 nodeName:}" failed. No retries permitted until 2025-12-11 13:25:05.317935609 +0000 UTC m=+1262.890262046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift") pod "swift-storage-0" (UID: "717e7951-d95b-497f-b2b7-3ec4ef755642") : configmap "swift-ring-files" not found Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.818363 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.845609 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kv4z\" (UniqueName: \"kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-kube-api-access-5kv4z\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 
13:25:04.869095 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.995352 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.995399 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.995441 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.996205 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ca4a970ba19c45b9f6200362f9a5d9ef16d6404aa1395da5964e4d307dc4af2f"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 13:25:04 crc kubenswrapper[4898]: I1211 13:25:04.996257 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" 
containerID="cri-o://ca4a970ba19c45b9f6200362f9a5d9ef16d6404aa1395da5964e4d307dc4af2f" gracePeriod=600 Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.090592 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-lx2j5"] Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.091925 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.093389 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.094120 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.094745 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.108306 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-lx2j5"] Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.225091 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d5760a1-aea8-4f95-8da7-8832f8879d57-scripts\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.225150 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl5t4\" (UniqueName: \"kubernetes.io/projected/6d5760a1-aea8-4f95-8da7-8832f8879d57-kube-api-access-sl5t4\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.225190 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6d5760a1-aea8-4f95-8da7-8832f8879d57-etc-swift\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.225356 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-swiftconf\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.225594 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-combined-ca-bundle\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.225788 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6d5760a1-aea8-4f95-8da7-8832f8879d57-ring-data-devices\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.225959 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-dispersionconf\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.327510 
4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-dispersionconf\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.327637 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d5760a1-aea8-4f95-8da7-8832f8879d57-scripts\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.327677 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl5t4\" (UniqueName: \"kubernetes.io/projected/6d5760a1-aea8-4f95-8da7-8832f8879d57-kube-api-access-sl5t4\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.327727 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6d5760a1-aea8-4f95-8da7-8832f8879d57-etc-swift\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.327804 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-swiftconf\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.327849 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-combined-ca-bundle\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.327906 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6d5760a1-aea8-4f95-8da7-8832f8879d57-ring-data-devices\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.327938 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:05 crc kubenswrapper[4898]: E1211 13:25:05.328098 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 11 13:25:05 crc kubenswrapper[4898]: E1211 13:25:05.328114 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 11 13:25:05 crc kubenswrapper[4898]: E1211 13:25:05.328174 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift podName:717e7951-d95b-497f-b2b7-3ec4ef755642 nodeName:}" failed. No retries permitted until 2025-12-11 13:25:06.3281568 +0000 UTC m=+1263.900483247 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift") pod "swift-storage-0" (UID: "717e7951-d95b-497f-b2b7-3ec4ef755642") : configmap "swift-ring-files" not found Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.329003 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6d5760a1-aea8-4f95-8da7-8832f8879d57-ring-data-devices\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.329018 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d5760a1-aea8-4f95-8da7-8832f8879d57-scripts\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.329285 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6d5760a1-aea8-4f95-8da7-8832f8879d57-etc-swift\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.334891 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-dispersionconf\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.335138 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-combined-ca-bundle\") pod 
\"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.343773 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl5t4\" (UniqueName: \"kubernetes.io/projected/6d5760a1-aea8-4f95-8da7-8832f8879d57-kube-api-access-sl5t4\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.345961 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-swiftconf\") pod \"swift-ring-rebalance-lx2j5\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.418743 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:05 crc kubenswrapper[4898]: I1211 13:25:05.805837 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-6l72c" podUID="d26c4733-9d89-4666-8554-f769295ad7b3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.145:5353: connect: connection refused" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.071781 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="ca4a970ba19c45b9f6200362f9a5d9ef16d6404aa1395da5964e4d307dc4af2f" exitCode=0 Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.071850 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"ca4a970ba19c45b9f6200362f9a5d9ef16d6404aa1395da5964e4d307dc4af2f"} Dec 11 13:25:06 crc 
kubenswrapper[4898]: I1211 13:25:06.072202 4898 scope.go:117] "RemoveContainer" containerID="d75b5443399659dfc4c4753c4049d2fed5f950b8d251c55a9abe387518cd27d2" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.347490 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:06 crc kubenswrapper[4898]: E1211 13:25:06.347780 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 11 13:25:06 crc kubenswrapper[4898]: E1211 13:25:06.347794 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 11 13:25:06 crc kubenswrapper[4898]: E1211 13:25:06.347979 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift podName:717e7951-d95b-497f-b2b7-3ec4ef755642 nodeName:}" failed. No retries permitted until 2025-12-11 13:25:08.347937483 +0000 UTC m=+1265.920263920 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift") pod "swift-storage-0" (UID: "717e7951-d95b-497f-b2b7-3ec4ef755642") : configmap "swift-ring-files" not found Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.586192 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-x6tgv"] Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.587569 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-x6tgv" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.608784 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6502-account-create-update-6vp66"] Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.610126 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6502-account-create-update-6vp66" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.611806 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.624595 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-x6tgv"] Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.637076 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6502-account-create-update-6vp66"] Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.757072 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwkc5\" (UniqueName: \"kubernetes.io/projected/9d28503d-c258-4bea-870f-7d8b34591c6e-kube-api-access-lwkc5\") pod \"glance-6502-account-create-update-6vp66\" (UID: \"9d28503d-c258-4bea-870f-7d8b34591c6e\") " pod="openstack/glance-6502-account-create-update-6vp66" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.757389 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d685a72-643d-4739-9e26-28a37c6391d3-operator-scripts\") pod \"glance-db-create-x6tgv\" (UID: \"8d685a72-643d-4739-9e26-28a37c6391d3\") " pod="openstack/glance-db-create-x6tgv" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.757666 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9d28503d-c258-4bea-870f-7d8b34591c6e-operator-scripts\") pod \"glance-6502-account-create-update-6vp66\" (UID: \"9d28503d-c258-4bea-870f-7d8b34591c6e\") " pod="openstack/glance-6502-account-create-update-6vp66" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.757802 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2m94\" (UniqueName: \"kubernetes.io/projected/8d685a72-643d-4739-9e26-28a37c6391d3-kube-api-access-n2m94\") pod \"glance-db-create-x6tgv\" (UID: \"8d685a72-643d-4739-9e26-28a37c6391d3\") " pod="openstack/glance-db-create-x6tgv" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.859164 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwkc5\" (UniqueName: \"kubernetes.io/projected/9d28503d-c258-4bea-870f-7d8b34591c6e-kube-api-access-lwkc5\") pod \"glance-6502-account-create-update-6vp66\" (UID: \"9d28503d-c258-4bea-870f-7d8b34591c6e\") " pod="openstack/glance-6502-account-create-update-6vp66" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.859273 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d685a72-643d-4739-9e26-28a37c6391d3-operator-scripts\") pod \"glance-db-create-x6tgv\" (UID: \"8d685a72-643d-4739-9e26-28a37c6391d3\") " pod="openstack/glance-db-create-x6tgv" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.859919 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d685a72-643d-4739-9e26-28a37c6391d3-operator-scripts\") pod \"glance-db-create-x6tgv\" (UID: \"8d685a72-643d-4739-9e26-28a37c6391d3\") " pod="openstack/glance-db-create-x6tgv" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.860070 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/9d28503d-c258-4bea-870f-7d8b34591c6e-operator-scripts\") pod \"glance-6502-account-create-update-6vp66\" (UID: \"9d28503d-c258-4bea-870f-7d8b34591c6e\") " pod="openstack/glance-6502-account-create-update-6vp66" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.860139 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2m94\" (UniqueName: \"kubernetes.io/projected/8d685a72-643d-4739-9e26-28a37c6391d3-kube-api-access-n2m94\") pod \"glance-db-create-x6tgv\" (UID: \"8d685a72-643d-4739-9e26-28a37c6391d3\") " pod="openstack/glance-db-create-x6tgv" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.861220 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d28503d-c258-4bea-870f-7d8b34591c6e-operator-scripts\") pod \"glance-6502-account-create-update-6vp66\" (UID: \"9d28503d-c258-4bea-870f-7d8b34591c6e\") " pod="openstack/glance-6502-account-create-update-6vp66" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.883156 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwkc5\" (UniqueName: \"kubernetes.io/projected/9d28503d-c258-4bea-870f-7d8b34591c6e-kube-api-access-lwkc5\") pod \"glance-6502-account-create-update-6vp66\" (UID: \"9d28503d-c258-4bea-870f-7d8b34591c6e\") " pod="openstack/glance-6502-account-create-update-6vp66" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.883255 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2m94\" (UniqueName: \"kubernetes.io/projected/8d685a72-643d-4739-9e26-28a37c6391d3-kube-api-access-n2m94\") pod \"glance-db-create-x6tgv\" (UID: \"8d685a72-643d-4739-9e26-28a37c6391d3\") " pod="openstack/glance-db-create-x6tgv" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.909130 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-x6tgv" Dec 11 13:25:06 crc kubenswrapper[4898]: I1211 13:25:06.933020 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6502-account-create-update-6vp66" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.320038 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pddzq" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.325240 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-267e-account-create-update-rkrvq" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.332575 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-95d5-account-create-update-62mxt" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.360231 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mj9zd" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.476192 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75p8t\" (UniqueName: \"kubernetes.io/projected/d589c38c-afcf-4107-bf16-ac57d302576e-kube-api-access-75p8t\") pod \"d589c38c-afcf-4107-bf16-ac57d302576e\" (UID: \"d589c38c-afcf-4107-bf16-ac57d302576e\") " Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.476258 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04e19d4f-0d50-4924-bd7d-812d753d76ac-operator-scripts\") pod \"04e19d4f-0d50-4924-bd7d-812d753d76ac\" (UID: \"04e19d4f-0d50-4924-bd7d-812d753d76ac\") " Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.476332 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/399ee20b-e403-4d88-bd87-77a1c8b71e93-operator-scripts\") pod \"399ee20b-e403-4d88-bd87-77a1c8b71e93\" (UID: \"399ee20b-e403-4d88-bd87-77a1c8b71e93\") " Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.476398 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jczn8\" (UniqueName: \"kubernetes.io/projected/04e19d4f-0d50-4924-bd7d-812d753d76ac-kube-api-access-jczn8\") pod \"04e19d4f-0d50-4924-bd7d-812d753d76ac\" (UID: \"04e19d4f-0d50-4924-bd7d-812d753d76ac\") " Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.478588 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/376bd2c9-d19e-4322-8269-847515a788cb-operator-scripts\") pod \"376bd2c9-d19e-4322-8269-847515a788cb\" (UID: \"376bd2c9-d19e-4322-8269-847515a788cb\") " Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.478663 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjjr5\" (UniqueName: \"kubernetes.io/projected/399ee20b-e403-4d88-bd87-77a1c8b71e93-kube-api-access-zjjr5\") pod \"399ee20b-e403-4d88-bd87-77a1c8b71e93\" (UID: \"399ee20b-e403-4d88-bd87-77a1c8b71e93\") " Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.478792 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d589c38c-afcf-4107-bf16-ac57d302576e-operator-scripts\") pod \"d589c38c-afcf-4107-bf16-ac57d302576e\" (UID: \"d589c38c-afcf-4107-bf16-ac57d302576e\") " Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.478818 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nglq\" (UniqueName: \"kubernetes.io/projected/376bd2c9-d19e-4322-8269-847515a788cb-kube-api-access-6nglq\") pod \"376bd2c9-d19e-4322-8269-847515a788cb\" (UID: \"376bd2c9-d19e-4322-8269-847515a788cb\") " 
Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.481295 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04e19d4f-0d50-4924-bd7d-812d753d76ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "04e19d4f-0d50-4924-bd7d-812d753d76ac" (UID: "04e19d4f-0d50-4924-bd7d-812d753d76ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.482046 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/376bd2c9-d19e-4322-8269-847515a788cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "376bd2c9-d19e-4322-8269-847515a788cb" (UID: "376bd2c9-d19e-4322-8269-847515a788cb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.482326 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d589c38c-afcf-4107-bf16-ac57d302576e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d589c38c-afcf-4107-bf16-ac57d302576e" (UID: "d589c38c-afcf-4107-bf16-ac57d302576e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.483166 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d589c38c-afcf-4107-bf16-ac57d302576e-kube-api-access-75p8t" (OuterVolumeSpecName: "kube-api-access-75p8t") pod "d589c38c-afcf-4107-bf16-ac57d302576e" (UID: "d589c38c-afcf-4107-bf16-ac57d302576e"). InnerVolumeSpecName "kube-api-access-75p8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.483775 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e19d4f-0d50-4924-bd7d-812d753d76ac-kube-api-access-jczn8" (OuterVolumeSpecName: "kube-api-access-jczn8") pod "04e19d4f-0d50-4924-bd7d-812d753d76ac" (UID: "04e19d4f-0d50-4924-bd7d-812d753d76ac"). InnerVolumeSpecName "kube-api-access-jczn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.484997 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/399ee20b-e403-4d88-bd87-77a1c8b71e93-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "399ee20b-e403-4d88-bd87-77a1c8b71e93" (UID: "399ee20b-e403-4d88-bd87-77a1c8b71e93"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.487594 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/399ee20b-e403-4d88-bd87-77a1c8b71e93-kube-api-access-zjjr5" (OuterVolumeSpecName: "kube-api-access-zjjr5") pod "399ee20b-e403-4d88-bd87-77a1c8b71e93" (UID: "399ee20b-e403-4d88-bd87-77a1c8b71e93"). InnerVolumeSpecName "kube-api-access-zjjr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.487685 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/376bd2c9-d19e-4322-8269-847515a788cb-kube-api-access-6nglq" (OuterVolumeSpecName: "kube-api-access-6nglq") pod "376bd2c9-d19e-4322-8269-847515a788cb" (UID: "376bd2c9-d19e-4322-8269-847515a788cb"). InnerVolumeSpecName "kube-api-access-6nglq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.582387 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75p8t\" (UniqueName: \"kubernetes.io/projected/d589c38c-afcf-4107-bf16-ac57d302576e-kube-api-access-75p8t\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.582690 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04e19d4f-0d50-4924-bd7d-812d753d76ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.582700 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/399ee20b-e403-4d88-bd87-77a1c8b71e93-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.582709 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jczn8\" (UniqueName: \"kubernetes.io/projected/04e19d4f-0d50-4924-bd7d-812d753d76ac-kube-api-access-jczn8\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.582718 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/376bd2c9-d19e-4322-8269-847515a788cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.582727 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjjr5\" (UniqueName: \"kubernetes.io/projected/399ee20b-e403-4d88-bd87-77a1c8b71e93-kube-api-access-zjjr5\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.582755 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d589c38c-afcf-4107-bf16-ac57d302576e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 
13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.582767 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nglq\" (UniqueName: \"kubernetes.io/projected/376bd2c9-d19e-4322-8269-847515a788cb-kube-api-access-6nglq\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.736012 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.891186 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-ovsdbserver-nb\") pod \"d26c4733-9d89-4666-8554-f769295ad7b3\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.891672 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-ovsdbserver-sb\") pod \"d26c4733-9d89-4666-8554-f769295ad7b3\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.891731 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-config\") pod \"d26c4733-9d89-4666-8554-f769295ad7b3\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.892004 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-dns-svc\") pod \"d26c4733-9d89-4666-8554-f769295ad7b3\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.892114 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-bpmgn\" (UniqueName: \"kubernetes.io/projected/d26c4733-9d89-4666-8554-f769295ad7b3-kube-api-access-bpmgn\") pod \"d26c4733-9d89-4666-8554-f769295ad7b3\" (UID: \"d26c4733-9d89-4666-8554-f769295ad7b3\") " Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.908451 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26c4733-9d89-4666-8554-f769295ad7b3-kube-api-access-bpmgn" (OuterVolumeSpecName: "kube-api-access-bpmgn") pod "d26c4733-9d89-4666-8554-f769295ad7b3" (UID: "d26c4733-9d89-4666-8554-f769295ad7b3"). InnerVolumeSpecName "kube-api-access-bpmgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.957318 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d26c4733-9d89-4666-8554-f769295ad7b3" (UID: "d26c4733-9d89-4666-8554-f769295ad7b3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.967366 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d26c4733-9d89-4666-8554-f769295ad7b3" (UID: "d26c4733-9d89-4666-8554-f769295ad7b3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.992704 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d26c4733-9d89-4666-8554-f769295ad7b3" (UID: "d26c4733-9d89-4666-8554-f769295ad7b3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.994876 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.994907 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.994917 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpmgn\" (UniqueName: \"kubernetes.io/projected/d26c4733-9d89-4666-8554-f769295ad7b3-kube-api-access-bpmgn\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:07 crc kubenswrapper[4898]: I1211 13:25:07.994929 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.013215 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-config" (OuterVolumeSpecName: "config") pod "d26c4733-9d89-4666-8554-f769295ad7b3" (UID: "d26c4733-9d89-4666-8554-f769295ad7b3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.037528 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-kljnd"] Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.094799 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pddzq" event={"ID":"399ee20b-e403-4d88-bd87-77a1c8b71e93","Type":"ContainerDied","Data":"780288fb3ce668c2ed9675211afe42d9244a274570cb8d96d1a08e5c50cba32e"} Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.095364 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="780288fb3ce668c2ed9675211afe42d9244a274570cb8d96d1a08e5c50cba32e" Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.095519 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pddzq" Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.102622 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26c4733-9d89-4666-8554-f769295ad7b3-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.102795 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-6l72c" event={"ID":"d26c4733-9d89-4666-8554-f769295ad7b3","Type":"ContainerDied","Data":"22e7cbc110c8cecadc12718c48c0b8bcdfc6367feb5e79f51657b84b30cd5b83"} Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.102832 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-6l72c" Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.102866 4898 scope.go:117] "RemoveContainer" containerID="7828a81136b680b4f184b12a4acd807787096a87b5cfa09e7b4fa75e117fbef8" Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.107734 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-95d5-account-create-update-62mxt" event={"ID":"376bd2c9-d19e-4322-8269-847515a788cb","Type":"ContainerDied","Data":"a46ee95214c18d74a88e4ba2ee7cd18d649900cf550ef4ccbb552f217c1af0dd"} Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.107787 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a46ee95214c18d74a88e4ba2ee7cd18d649900cf550ef4ccbb552f217c1af0dd" Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.107856 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-95d5-account-create-update-62mxt" Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.110617 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-267e-account-create-update-rkrvq" event={"ID":"04e19d4f-0d50-4924-bd7d-812d753d76ac","Type":"ContainerDied","Data":"fdecbca737a9e141765bad4ae02e5665b7455264912ed8fd01474674883d4923"} Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.110663 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdecbca737a9e141765bad4ae02e5665b7455264912ed8fd01474674883d4923" Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.110716 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-267e-account-create-update-rkrvq" Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.116861 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"87df6c08-c7eb-4e54-a329-6343e195c6f3","Type":"ContainerStarted","Data":"69179c7c18851bf5d304c6633fde399a71c8e469371a335fdd351be9705b5692"} Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.118333 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" event={"ID":"75e494bf-b288-45e2-8f87-7c146a9bb74f","Type":"ContainerStarted","Data":"0bacbdb44b71cd986e9cbadda0cd0afa8def878e1ff2ae92066c98613391ab46"} Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.120379 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mj9zd" event={"ID":"d589c38c-afcf-4107-bf16-ac57d302576e","Type":"ContainerDied","Data":"abec92ee8d976d8bfbd1723bfd8aa462876afb8a5d5e6a5a5643a32497c300e0"} Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.120419 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abec92ee8d976d8bfbd1723bfd8aa462876afb8a5d5e6a5a5643a32497c300e0" Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.120443 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-mj9zd" Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.122927 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"0cb68cd95a282ab2d60091b5e9b86bf1ff863de795c2f88e13ffcb75e9110201"} Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.188998 4898 scope.go:117] "RemoveContainer" containerID="e8b05aeb810e7971ccaacd84e800c2b24aff185c98cd25917b3884958ac51054" Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.353472 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-6l72c"] Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.370396 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-6l72c"] Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.409644 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:08 crc kubenswrapper[4898]: E1211 13:25:08.410194 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 11 13:25:08 crc kubenswrapper[4898]: E1211 13:25:08.410208 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 11 13:25:08 crc kubenswrapper[4898]: E1211 13:25:08.410253 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift podName:717e7951-d95b-497f-b2b7-3ec4ef755642 nodeName:}" failed. 
No retries permitted until 2025-12-11 13:25:12.410237698 +0000 UTC m=+1269.982564135 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift") pod "swift-storage-0" (UID: "717e7951-d95b-497f-b2b7-3ec4ef755642") : configmap "swift-ring-files" not found Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.466148 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-x6tgv"] Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.476536 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-2649-account-create-update-xf8ww"] Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.487865 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6502-account-create-update-6vp66"] Dec 11 13:25:08 crc kubenswrapper[4898]: W1211 13:25:08.489792 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7b41e46_2b51_441b_b181_ab36339a8d19.slice/crio-e553c9f827d6a01c1e5c217b14bb441d44d00adac6ba29cacba14145ed768ad4 WatchSource:0}: Error finding container e553c9f827d6a01c1e5c217b14bb441d44d00adac6ba29cacba14145ed768ad4: Status 404 returned error can't find the container with id e553c9f827d6a01c1e5c217b14bb441d44d00adac6ba29cacba14145ed768ad4 Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.512570 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-lx2j5"] Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.521572 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.530673 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-2wggg"] Dec 11 13:25:08 crc kubenswrapper[4898]: I1211 13:25:08.791061 4898 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d26c4733-9d89-4666-8554-f769295ad7b3" path="/var/lib/kubelet/pods/d26c4733-9d89-4666-8554-f769295ad7b3/volumes" Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.134768 4898 generic.go:334] "Generic (PLEG): container finished" podID="b83689fb-d5d6-4c88-8976-791b41ff048f" containerID="ee6a9e1bfc5f1196653d49d8f163454806653e71d73badb97e660496a768d9c3" exitCode=0 Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.134851 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-2649-account-create-update-xf8ww" event={"ID":"b83689fb-d5d6-4c88-8976-791b41ff048f","Type":"ContainerDied","Data":"ee6a9e1bfc5f1196653d49d8f163454806653e71d73badb97e660496a768d9c3"} Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.135144 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-2649-account-create-update-xf8ww" event={"ID":"b83689fb-d5d6-4c88-8976-791b41ff048f","Type":"ContainerStarted","Data":"ae352e218b5da2d86c37baa9c869ec1803573b0176aeeeca804064ac372b5c66"} Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.136768 4898 generic.go:334] "Generic (PLEG): container finished" podID="75e494bf-b288-45e2-8f87-7c146a9bb74f" containerID="d34d334e13830b57ee1b38d6cdb7249be2dc658324bb3afacd49fac3f310027b" exitCode=0 Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.136841 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" event={"ID":"75e494bf-b288-45e2-8f87-7c146a9bb74f","Type":"ContainerDied","Data":"d34d334e13830b57ee1b38d6cdb7249be2dc658324bb3afacd49fac3f310027b"} Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.141567 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7b41e46-2b51-441b-b181-ab36339a8d19" containerID="e01a2f3c04c58809c753d3d56c8b5179d770c2fb21f9f086d7a992fd0e41e50d" exitCode=0 Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.141663 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-2wggg" event={"ID":"b7b41e46-2b51-441b-b181-ab36339a8d19","Type":"ContainerDied","Data":"e01a2f3c04c58809c753d3d56c8b5179d770c2fb21f9f086d7a992fd0e41e50d"} Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.141698 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-2wggg" event={"ID":"b7b41e46-2b51-441b-b181-ab36339a8d19","Type":"ContainerStarted","Data":"e553c9f827d6a01c1e5c217b14bb441d44d00adac6ba29cacba14145ed768ad4"} Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.146022 4898 generic.go:334] "Generic (PLEG): container finished" podID="8d685a72-643d-4739-9e26-28a37c6391d3" containerID="146f4995a97325967ce27374ab4cb69947c0be1a93c44dbee535156d898cc07b" exitCode=0 Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.146090 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-x6tgv" event={"ID":"8d685a72-643d-4739-9e26-28a37c6391d3","Type":"ContainerDied","Data":"146f4995a97325967ce27374ab4cb69947c0be1a93c44dbee535156d898cc07b"} Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.146118 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-x6tgv" event={"ID":"8d685a72-643d-4739-9e26-28a37c6391d3","Type":"ContainerStarted","Data":"ba633db7e0e2b09a133c46be1b40b84179d7618c1157ff4a1689bfed65bd9754"} Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.150818 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lx2j5" event={"ID":"6d5760a1-aea8-4f95-8da7-8832f8879d57","Type":"ContainerStarted","Data":"b823c9a858364f5c6ccbb6a91ccbee90e356eb4e21a9dcb5d28a1a826c051716"} Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.155307 4898 generic.go:334] "Generic (PLEG): container finished" podID="9d28503d-c258-4bea-870f-7d8b34591c6e" 
containerID="369e4b9ed923862da0bc88beeeeff2e5ceda527cab10a6f59c016a734e9429f1" exitCode=0 Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.155496 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6502-account-create-update-6vp66" event={"ID":"9d28503d-c258-4bea-870f-7d8b34591c6e","Type":"ContainerDied","Data":"369e4b9ed923862da0bc88beeeeff2e5ceda527cab10a6f59c016a734e9429f1"} Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.155606 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6502-account-create-update-6vp66" event={"ID":"9d28503d-c258-4bea-870f-7d8b34591c6e","Type":"ContainerStarted","Data":"770de92dac938ef2541f9b3640afc1d3a71ec3f0b22cbf6f68de8a700415fd07"} Dec 11 13:25:09 crc kubenswrapper[4898]: I1211 13:25:09.757943 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6f8f67696b-jczc2" podUID="c31c6aa7-c341-4e86-92d2-56ea1ab37168" containerName="console" containerID="cri-o://49dba1f0a7e23cfadf22b5fecf6ad3ac4c0b0b87d27f3f34e37ae4fe72a4bd35" gracePeriod=15 Dec 11 13:25:10 crc kubenswrapper[4898]: I1211 13:25:10.168628 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" event={"ID":"75e494bf-b288-45e2-8f87-7c146a9bb74f","Type":"ContainerStarted","Data":"315e559ddb3ac3b90876b8105f8d971ba84c4c052f8ac4100c18d896859254f1"} Dec 11 13:25:10 crc kubenswrapper[4898]: I1211 13:25:10.169202 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:10 crc kubenswrapper[4898]: I1211 13:25:10.170758 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f8f67696b-jczc2_c31c6aa7-c341-4e86-92d2-56ea1ab37168/console/0.log" Dec 11 13:25:10 crc kubenswrapper[4898]: I1211 13:25:10.170797 4898 generic.go:334] "Generic (PLEG): container finished" podID="c31c6aa7-c341-4e86-92d2-56ea1ab37168" 
containerID="49dba1f0a7e23cfadf22b5fecf6ad3ac4c0b0b87d27f3f34e37ae4fe72a4bd35" exitCode=2 Dec 11 13:25:10 crc kubenswrapper[4898]: I1211 13:25:10.170958 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f8f67696b-jczc2" event={"ID":"c31c6aa7-c341-4e86-92d2-56ea1ab37168","Type":"ContainerDied","Data":"49dba1f0a7e23cfadf22b5fecf6ad3ac4c0b0b87d27f3f34e37ae4fe72a4bd35"} Dec 11 13:25:10 crc kubenswrapper[4898]: I1211 13:25:10.199276 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" podStartSLOduration=7.199251209 podStartE2EDuration="7.199251209s" podCreationTimestamp="2025-12-11 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:25:10.193406075 +0000 UTC m=+1267.765732522" watchObservedRunningTime="2025-12-11 13:25:10.199251209 +0000 UTC m=+1267.771577656" Dec 11 13:25:10 crc kubenswrapper[4898]: I1211 13:25:10.809523 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6502-account-create-update-6vp66" Dec 11 13:25:10 crc kubenswrapper[4898]: I1211 13:25:10.866692 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwkc5\" (UniqueName: \"kubernetes.io/projected/9d28503d-c258-4bea-870f-7d8b34591c6e-kube-api-access-lwkc5\") pod \"9d28503d-c258-4bea-870f-7d8b34591c6e\" (UID: \"9d28503d-c258-4bea-870f-7d8b34591c6e\") " Dec 11 13:25:10 crc kubenswrapper[4898]: I1211 13:25:10.866807 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d28503d-c258-4bea-870f-7d8b34591c6e-operator-scripts\") pod \"9d28503d-c258-4bea-870f-7d8b34591c6e\" (UID: \"9d28503d-c258-4bea-870f-7d8b34591c6e\") " Dec 11 13:25:10 crc kubenswrapper[4898]: I1211 13:25:10.867691 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d28503d-c258-4bea-870f-7d8b34591c6e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d28503d-c258-4bea-870f-7d8b34591c6e" (UID: "9d28503d-c258-4bea-870f-7d8b34591c6e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:10 crc kubenswrapper[4898]: I1211 13:25:10.870580 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d28503d-c258-4bea-870f-7d8b34591c6e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:10 crc kubenswrapper[4898]: I1211 13:25:10.873982 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d28503d-c258-4bea-870f-7d8b34591c6e-kube-api-access-lwkc5" (OuterVolumeSpecName: "kube-api-access-lwkc5") pod "9d28503d-c258-4bea-870f-7d8b34591c6e" (UID: "9d28503d-c258-4bea-870f-7d8b34591c6e"). InnerVolumeSpecName "kube-api-access-lwkc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:10 crc kubenswrapper[4898]: I1211 13:25:10.974047 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwkc5\" (UniqueName: \"kubernetes.io/projected/9d28503d-c258-4bea-870f-7d8b34591c6e-kube-api-access-lwkc5\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.182074 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6502-account-create-update-6vp66" event={"ID":"9d28503d-c258-4bea-870f-7d8b34591c6e","Type":"ContainerDied","Data":"770de92dac938ef2541f9b3640afc1d3a71ec3f0b22cbf6f68de8a700415fd07"} Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.182802 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="770de92dac938ef2541f9b3640afc1d3a71ec3f0b22cbf6f68de8a700415fd07" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.182196 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6502-account-create-update-6vp66" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.184060 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-2649-account-create-update-xf8ww" event={"ID":"b83689fb-d5d6-4c88-8976-791b41ff048f","Type":"ContainerDied","Data":"ae352e218b5da2d86c37baa9c869ec1803573b0176aeeeca804064ac372b5c66"} Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.184110 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae352e218b5da2d86c37baa9c869ec1803573b0176aeeeca804064ac372b5c66" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.186187 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-x6tgv" event={"ID":"8d685a72-643d-4739-9e26-28a37c6391d3","Type":"ContainerDied","Data":"ba633db7e0e2b09a133c46be1b40b84179d7618c1157ff4a1689bfed65bd9754"} Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.186241 
4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba633db7e0e2b09a133c46be1b40b84179d7618c1157ff4a1689bfed65bd9754" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.188045 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-2wggg" event={"ID":"b7b41e46-2b51-441b-b181-ab36339a8d19","Type":"ContainerDied","Data":"e553c9f827d6a01c1e5c217b14bb441d44d00adac6ba29cacba14145ed768ad4"} Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.188139 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e553c9f827d6a01c1e5c217b14bb441d44d00adac6ba29cacba14145ed768ad4" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.213895 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-x6tgv" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.223082 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-2649-account-create-update-xf8ww" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.234078 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-2wggg" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.280028 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2m94\" (UniqueName: \"kubernetes.io/projected/8d685a72-643d-4739-9e26-28a37c6391d3-kube-api-access-n2m94\") pod \"8d685a72-643d-4739-9e26-28a37c6391d3\" (UID: \"8d685a72-643d-4739-9e26-28a37c6391d3\") " Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.280294 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d685a72-643d-4739-9e26-28a37c6391d3-operator-scripts\") pod \"8d685a72-643d-4739-9e26-28a37c6391d3\" (UID: \"8d685a72-643d-4739-9e26-28a37c6391d3\") " Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.281570 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d685a72-643d-4739-9e26-28a37c6391d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d685a72-643d-4739-9e26-28a37c6391d3" (UID: "8d685a72-643d-4739-9e26-28a37c6391d3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.382207 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfcdb\" (UniqueName: \"kubernetes.io/projected/b7b41e46-2b51-441b-b181-ab36339a8d19-kube-api-access-rfcdb\") pod \"b7b41e46-2b51-441b-b181-ab36339a8d19\" (UID: \"b7b41e46-2b51-441b-b181-ab36339a8d19\") " Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.382284 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83689fb-d5d6-4c88-8976-791b41ff048f-operator-scripts\") pod \"b83689fb-d5d6-4c88-8976-791b41ff048f\" (UID: \"b83689fb-d5d6-4c88-8976-791b41ff048f\") " Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.382309 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b41e46-2b51-441b-b181-ab36339a8d19-operator-scripts\") pod \"b7b41e46-2b51-441b-b181-ab36339a8d19\" (UID: \"b7b41e46-2b51-441b-b181-ab36339a8d19\") " Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.382583 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4d47\" (UniqueName: \"kubernetes.io/projected/b83689fb-d5d6-4c88-8976-791b41ff048f-kube-api-access-v4d47\") pod \"b83689fb-d5d6-4c88-8976-791b41ff048f\" (UID: \"b83689fb-d5d6-4c88-8976-791b41ff048f\") " Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.383003 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d685a72-643d-4739-9e26-28a37c6391d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.383110 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8d685a72-643d-4739-9e26-28a37c6391d3-kube-api-access-n2m94" (OuterVolumeSpecName: "kube-api-access-n2m94") pod "8d685a72-643d-4739-9e26-28a37c6391d3" (UID: "8d685a72-643d-4739-9e26-28a37c6391d3"). InnerVolumeSpecName "kube-api-access-n2m94". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.383512 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7b41e46-2b51-441b-b181-ab36339a8d19-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b7b41e46-2b51-441b-b181-ab36339a8d19" (UID: "b7b41e46-2b51-441b-b181-ab36339a8d19"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.386142 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b83689fb-d5d6-4c88-8976-791b41ff048f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b83689fb-d5d6-4c88-8976-791b41ff048f" (UID: "b83689fb-d5d6-4c88-8976-791b41ff048f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.386732 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b41e46-2b51-441b-b181-ab36339a8d19-kube-api-access-rfcdb" (OuterVolumeSpecName: "kube-api-access-rfcdb") pod "b7b41e46-2b51-441b-b181-ab36339a8d19" (UID: "b7b41e46-2b51-441b-b181-ab36339a8d19"). InnerVolumeSpecName "kube-api-access-rfcdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.388423 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b83689fb-d5d6-4c88-8976-791b41ff048f-kube-api-access-v4d47" (OuterVolumeSpecName: "kube-api-access-v4d47") pod "b83689fb-d5d6-4c88-8976-791b41ff048f" (UID: "b83689fb-d5d6-4c88-8976-791b41ff048f"). InnerVolumeSpecName "kube-api-access-v4d47". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.485147 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4d47\" (UniqueName: \"kubernetes.io/projected/b83689fb-d5d6-4c88-8976-791b41ff048f-kube-api-access-v4d47\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.485186 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfcdb\" (UniqueName: \"kubernetes.io/projected/b7b41e46-2b51-441b-b181-ab36339a8d19-kube-api-access-rfcdb\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.485195 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83689fb-d5d6-4c88-8976-791b41ff048f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.485204 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b41e46-2b51-441b-b181-ab36339a8d19-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.485213 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2m94\" (UniqueName: \"kubernetes.io/projected/8d685a72-643d-4739-9e26-28a37c6391d3-kube-api-access-n2m94\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.665830 4898 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.828625 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f8f67696b-jczc2_c31c6aa7-c341-4e86-92d2-56ea1ab37168/console/0.log" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.828897 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.895199 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-oauth-serving-cert\") pod \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.895283 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-service-ca\") pod \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.895348 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-config\") pod \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.895416 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-serving-cert\") pod \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.895577 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rz5p\" (UniqueName: \"kubernetes.io/projected/c31c6aa7-c341-4e86-92d2-56ea1ab37168-kube-api-access-7rz5p\") pod \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.895651 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-oauth-config\") pod \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.895737 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-trusted-ca-bundle\") pod \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\" (UID: \"c31c6aa7-c341-4e86-92d2-56ea1ab37168\") " Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.897557 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-service-ca" (OuterVolumeSpecName: "service-ca") pod "c31c6aa7-c341-4e86-92d2-56ea1ab37168" (UID: "c31c6aa7-c341-4e86-92d2-56ea1ab37168"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.897662 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c31c6aa7-c341-4e86-92d2-56ea1ab37168" (UID: "c31c6aa7-c341-4e86-92d2-56ea1ab37168"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.897536 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-config" (OuterVolumeSpecName: "console-config") pod "c31c6aa7-c341-4e86-92d2-56ea1ab37168" (UID: "c31c6aa7-c341-4e86-92d2-56ea1ab37168"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.897847 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c31c6aa7-c341-4e86-92d2-56ea1ab37168" (UID: "c31c6aa7-c341-4e86-92d2-56ea1ab37168"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.902907 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c31c6aa7-c341-4e86-92d2-56ea1ab37168" (UID: "c31c6aa7-c341-4e86-92d2-56ea1ab37168"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.904809 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c31c6aa7-c341-4e86-92d2-56ea1ab37168" (UID: "c31c6aa7-c341-4e86-92d2-56ea1ab37168"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.906843 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c31c6aa7-c341-4e86-92d2-56ea1ab37168-kube-api-access-7rz5p" (OuterVolumeSpecName: "kube-api-access-7rz5p") pod "c31c6aa7-c341-4e86-92d2-56ea1ab37168" (UID: "c31c6aa7-c341-4e86-92d2-56ea1ab37168"). InnerVolumeSpecName "kube-api-access-7rz5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.998403 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rz5p\" (UniqueName: \"kubernetes.io/projected/c31c6aa7-c341-4e86-92d2-56ea1ab37168-kube-api-access-7rz5p\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.998433 4898 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.998441 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.998463 4898 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.998472 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-service-ca\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.998481 4898 reconciler_common.go:293] "Volume detached for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:11 crc kubenswrapper[4898]: I1211 13:25:11.998490 4898 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c31c6aa7-c341-4e86-92d2-56ea1ab37168-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:12 crc kubenswrapper[4898]: I1211 13:25:12.202549 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"87df6c08-c7eb-4e54-a329-6343e195c6f3","Type":"ContainerStarted","Data":"39e2333a7a41f963b9bfefb04ce49dffd32436c98690f83c354375a6e1f91a1e"} Dec 11 13:25:12 crc kubenswrapper[4898]: I1211 13:25:12.205867 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f8f67696b-jczc2_c31c6aa7-c341-4e86-92d2-56ea1ab37168/console/0.log" Dec 11 13:25:12 crc kubenswrapper[4898]: I1211 13:25:12.205966 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-2wggg" Dec 11 13:25:12 crc kubenswrapper[4898]: I1211 13:25:12.206049 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f8f67696b-jczc2" event={"ID":"c31c6aa7-c341-4e86-92d2-56ea1ab37168","Type":"ContainerDied","Data":"d8eda09cf08c67b4ca42a3ef68d3fd652b8f12351dc59b833a7c89e246030bda"} Dec 11 13:25:12 crc kubenswrapper[4898]: I1211 13:25:12.206092 4898 scope.go:117] "RemoveContainer" containerID="49dba1f0a7e23cfadf22b5fecf6ad3ac4c0b0b87d27f3f34e37ae4fe72a4bd35" Dec 11 13:25:12 crc kubenswrapper[4898]: I1211 13:25:12.206122 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-x6tgv" Dec 11 13:25:12 crc kubenswrapper[4898]: I1211 13:25:12.206151 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f8f67696b-jczc2" Dec 11 13:25:12 crc kubenswrapper[4898]: I1211 13:25:12.206228 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-2649-account-create-update-xf8ww" Dec 11 13:25:12 crc kubenswrapper[4898]: I1211 13:25:12.270746 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f8f67696b-jczc2"] Dec 11 13:25:12 crc kubenswrapper[4898]: I1211 13:25:12.284682 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6f8f67696b-jczc2"] Dec 11 13:25:12 crc kubenswrapper[4898]: I1211 13:25:12.506998 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:12 crc kubenswrapper[4898]: E1211 13:25:12.507193 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 11 13:25:12 crc kubenswrapper[4898]: E1211 13:25:12.507219 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 11 13:25:12 crc kubenswrapper[4898]: E1211 13:25:12.507277 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift podName:717e7951-d95b-497f-b2b7-3ec4ef755642 nodeName:}" failed. No retries permitted until 2025-12-11 13:25:20.507259542 +0000 UTC m=+1278.079585969 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift") pod "swift-storage-0" (UID: "717e7951-d95b-497f-b2b7-3ec4ef755642") : configmap "swift-ring-files" not found Dec 11 13:25:12 crc kubenswrapper[4898]: I1211 13:25:12.797651 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c31c6aa7-c341-4e86-92d2-56ea1ab37168" path="/var/lib/kubelet/pods/c31c6aa7-c341-4e86-92d2-56ea1ab37168/volumes" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.535619 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq"] Dec 11 13:25:13 crc kubenswrapper[4898]: E1211 13:25:13.536304 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b41e46-2b51-441b-b181-ab36339a8d19" containerName="mariadb-database-create" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536319 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b41e46-2b51-441b-b181-ab36339a8d19" containerName="mariadb-database-create" Dec 11 13:25:13 crc kubenswrapper[4898]: E1211 13:25:13.536337 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83689fb-d5d6-4c88-8976-791b41ff048f" containerName="mariadb-account-create-update" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536345 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83689fb-d5d6-4c88-8976-791b41ff048f" containerName="mariadb-account-create-update" Dec 11 13:25:13 crc kubenswrapper[4898]: E1211 13:25:13.536357 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376bd2c9-d19e-4322-8269-847515a788cb" containerName="mariadb-account-create-update" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536364 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="376bd2c9-d19e-4322-8269-847515a788cb" containerName="mariadb-account-create-update" Dec 11 13:25:13 crc kubenswrapper[4898]: E1211 
13:25:13.536376 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399ee20b-e403-4d88-bd87-77a1c8b71e93" containerName="mariadb-database-create" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536383 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="399ee20b-e403-4d88-bd87-77a1c8b71e93" containerName="mariadb-database-create" Dec 11 13:25:13 crc kubenswrapper[4898]: E1211 13:25:13.536392 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d685a72-643d-4739-9e26-28a37c6391d3" containerName="mariadb-database-create" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536401 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d685a72-643d-4739-9e26-28a37c6391d3" containerName="mariadb-database-create" Dec 11 13:25:13 crc kubenswrapper[4898]: E1211 13:25:13.536417 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e19d4f-0d50-4924-bd7d-812d753d76ac" containerName="mariadb-account-create-update" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536425 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e19d4f-0d50-4924-bd7d-812d753d76ac" containerName="mariadb-account-create-update" Dec 11 13:25:13 crc kubenswrapper[4898]: E1211 13:25:13.536436 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26c4733-9d89-4666-8554-f769295ad7b3" containerName="dnsmasq-dns" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536443 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26c4733-9d89-4666-8554-f769295ad7b3" containerName="dnsmasq-dns" Dec 11 13:25:13 crc kubenswrapper[4898]: E1211 13:25:13.536483 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c31c6aa7-c341-4e86-92d2-56ea1ab37168" containerName="console" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536492 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31c6aa7-c341-4e86-92d2-56ea1ab37168" containerName="console" Dec 11 13:25:13 crc 
kubenswrapper[4898]: E1211 13:25:13.536501 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26c4733-9d89-4666-8554-f769295ad7b3" containerName="init" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536508 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26c4733-9d89-4666-8554-f769295ad7b3" containerName="init" Dec 11 13:25:13 crc kubenswrapper[4898]: E1211 13:25:13.536525 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d589c38c-afcf-4107-bf16-ac57d302576e" containerName="mariadb-database-create" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536532 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d589c38c-afcf-4107-bf16-ac57d302576e" containerName="mariadb-database-create" Dec 11 13:25:13 crc kubenswrapper[4898]: E1211 13:25:13.536558 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d28503d-c258-4bea-870f-7d8b34591c6e" containerName="mariadb-account-create-update" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536565 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d28503d-c258-4bea-870f-7d8b34591c6e" containerName="mariadb-account-create-update" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536777 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83689fb-d5d6-4c88-8976-791b41ff048f" containerName="mariadb-account-create-update" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536795 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b41e46-2b51-441b-b181-ab36339a8d19" containerName="mariadb-database-create" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536816 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c31c6aa7-c341-4e86-92d2-56ea1ab37168" containerName="console" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536826 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d589c38c-afcf-4107-bf16-ac57d302576e" 
containerName="mariadb-database-create" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536839 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e19d4f-0d50-4924-bd7d-812d753d76ac" containerName="mariadb-account-create-update" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536850 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26c4733-9d89-4666-8554-f769295ad7b3" containerName="dnsmasq-dns" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536864 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="399ee20b-e403-4d88-bd87-77a1c8b71e93" containerName="mariadb-database-create" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536877 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="376bd2c9-d19e-4322-8269-847515a788cb" containerName="mariadb-account-create-update" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536895 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d28503d-c258-4bea-870f-7d8b34591c6e" containerName="mariadb-account-create-update" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.536905 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d685a72-643d-4739-9e26-28a37c6391d3" containerName="mariadb-database-create" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.537581 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.586231 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq"] Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.643061 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cc39efd-7ed2-48a9-a27e-e637091064ef-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-bhvfq\" (UID: \"8cc39efd-7ed2-48a9-a27e-e637091064ef\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.643115 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqlrn\" (UniqueName: \"kubernetes.io/projected/8cc39efd-7ed2-48a9-a27e-e637091064ef-kube-api-access-vqlrn\") pod \"mysqld-exporter-openstack-cell1-db-create-bhvfq\" (UID: \"8cc39efd-7ed2-48a9-a27e-e637091064ef\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.745849 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cc39efd-7ed2-48a9-a27e-e637091064ef-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-bhvfq\" (UID: \"8cc39efd-7ed2-48a9-a27e-e637091064ef\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.745965 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqlrn\" (UniqueName: \"kubernetes.io/projected/8cc39efd-7ed2-48a9-a27e-e637091064ef-kube-api-access-vqlrn\") pod \"mysqld-exporter-openstack-cell1-db-create-bhvfq\" (UID: \"8cc39efd-7ed2-48a9-a27e-e637091064ef\") " 
pod="openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.747571 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-ca33-account-create-update-8x4rf"] Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.748959 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-ca33-account-create-update-8x4rf" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.751434 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cc39efd-7ed2-48a9-a27e-e637091064ef-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-bhvfq\" (UID: \"8cc39efd-7ed2-48a9-a27e-e637091064ef\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.751718 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.762168 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-ca33-account-create-update-8x4rf"] Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.785530 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqlrn\" (UniqueName: \"kubernetes.io/projected/8cc39efd-7ed2-48a9-a27e-e637091064ef-kube-api-access-vqlrn\") pod \"mysqld-exporter-openstack-cell1-db-create-bhvfq\" (UID: \"8cc39efd-7ed2-48a9-a27e-e637091064ef\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.848220 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27xj4\" (UniqueName: \"kubernetes.io/projected/95b56d2f-c89b-4531-9e3a-f2c7accab5dd-kube-api-access-27xj4\") pod 
\"mysqld-exporter-ca33-account-create-update-8x4rf\" (UID: \"95b56d2f-c89b-4531-9e3a-f2c7accab5dd\") " pod="openstack/mysqld-exporter-ca33-account-create-update-8x4rf" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.848321 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95b56d2f-c89b-4531-9e3a-f2c7accab5dd-operator-scripts\") pod \"mysqld-exporter-ca33-account-create-update-8x4rf\" (UID: \"95b56d2f-c89b-4531-9e3a-f2c7accab5dd\") " pod="openstack/mysqld-exporter-ca33-account-create-update-8x4rf" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.910703 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.951734 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27xj4\" (UniqueName: \"kubernetes.io/projected/95b56d2f-c89b-4531-9e3a-f2c7accab5dd-kube-api-access-27xj4\") pod \"mysqld-exporter-ca33-account-create-update-8x4rf\" (UID: \"95b56d2f-c89b-4531-9e3a-f2c7accab5dd\") " pod="openstack/mysqld-exporter-ca33-account-create-update-8x4rf" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.951826 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95b56d2f-c89b-4531-9e3a-f2c7accab5dd-operator-scripts\") pod \"mysqld-exporter-ca33-account-create-update-8x4rf\" (UID: \"95b56d2f-c89b-4531-9e3a-f2c7accab5dd\") " pod="openstack/mysqld-exporter-ca33-account-create-update-8x4rf" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.952616 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95b56d2f-c89b-4531-9e3a-f2c7accab5dd-operator-scripts\") pod 
\"mysqld-exporter-ca33-account-create-update-8x4rf\" (UID: \"95b56d2f-c89b-4531-9e3a-f2c7accab5dd\") " pod="openstack/mysqld-exporter-ca33-account-create-update-8x4rf" Dec 11 13:25:13 crc kubenswrapper[4898]: I1211 13:25:13.977860 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27xj4\" (UniqueName: \"kubernetes.io/projected/95b56d2f-c89b-4531-9e3a-f2c7accab5dd-kube-api-access-27xj4\") pod \"mysqld-exporter-ca33-account-create-update-8x4rf\" (UID: \"95b56d2f-c89b-4531-9e3a-f2c7accab5dd\") " pod="openstack/mysqld-exporter-ca33-account-create-update-8x4rf" Dec 11 13:25:14 crc kubenswrapper[4898]: I1211 13:25:14.113829 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-ca33-account-create-update-8x4rf" Dec 11 13:25:15 crc kubenswrapper[4898]: I1211 13:25:15.164957 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq"] Dec 11 13:25:15 crc kubenswrapper[4898]: W1211 13:25:15.172585 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cc39efd_7ed2_48a9_a27e_e637091064ef.slice/crio-5b1b44cfdc32defc8d4544c8ab15690179e74a1d227efebfc78cfa40ed409fd7 WatchSource:0}: Error finding container 5b1b44cfdc32defc8d4544c8ab15690179e74a1d227efebfc78cfa40ed409fd7: Status 404 returned error can't find the container with id 5b1b44cfdc32defc8d4544c8ab15690179e74a1d227efebfc78cfa40ed409fd7 Dec 11 13:25:15 crc kubenswrapper[4898]: I1211 13:25:15.241587 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-ca33-account-create-update-8x4rf"] Dec 11 13:25:15 crc kubenswrapper[4898]: I1211 13:25:15.242348 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq" 
event={"ID":"8cc39efd-7ed2-48a9-a27e-e637091064ef","Type":"ContainerStarted","Data":"5b1b44cfdc32defc8d4544c8ab15690179e74a1d227efebfc78cfa40ed409fd7"} Dec 11 13:25:15 crc kubenswrapper[4898]: I1211 13:25:15.244156 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lx2j5" event={"ID":"6d5760a1-aea8-4f95-8da7-8832f8879d57","Type":"ContainerStarted","Data":"44911eebd274ee213fd2d3374b6b2e15a715288e93dbe3edcd83c9ff6f131c30"} Dec 11 13:25:15 crc kubenswrapper[4898]: I1211 13:25:15.267322 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-lx2j5" podStartSLOduration=4.10718732 podStartE2EDuration="10.267298831s" podCreationTimestamp="2025-12-11 13:25:05 +0000 UTC" firstStartedPulling="2025-12-11 13:25:08.521319486 +0000 UTC m=+1266.093645923" lastFinishedPulling="2025-12-11 13:25:14.681430987 +0000 UTC m=+1272.253757434" observedRunningTime="2025-12-11 13:25:15.258878269 +0000 UTC m=+1272.831204716" watchObservedRunningTime="2025-12-11 13:25:15.267298831 +0000 UTC m=+1272.839625268" Dec 11 13:25:16 crc kubenswrapper[4898]: I1211 13:25:16.254591 4898 generic.go:334] "Generic (PLEG): container finished" podID="8cc39efd-7ed2-48a9-a27e-e637091064ef" containerID="612c2c79ebc3623dba06ee7e330fab02c5aaf37676368645a297d4ecef5f9190" exitCode=0 Dec 11 13:25:16 crc kubenswrapper[4898]: I1211 13:25:16.254699 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq" event={"ID":"8cc39efd-7ed2-48a9-a27e-e637091064ef","Type":"ContainerDied","Data":"612c2c79ebc3623dba06ee7e330fab02c5aaf37676368645a297d4ecef5f9190"} Dec 11 13:25:16 crc kubenswrapper[4898]: I1211 13:25:16.257647 4898 generic.go:334] "Generic (PLEG): container finished" podID="95b56d2f-c89b-4531-9e3a-f2c7accab5dd" containerID="bb2a3d1269e0f3aba572d22ac5a2c83c4753175f6dccd0278ce3743a1138239a" exitCode=0 Dec 11 13:25:16 crc kubenswrapper[4898]: I1211 13:25:16.257743 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-ca33-account-create-update-8x4rf" event={"ID":"95b56d2f-c89b-4531-9e3a-f2c7accab5dd","Type":"ContainerDied","Data":"bb2a3d1269e0f3aba572d22ac5a2c83c4753175f6dccd0278ce3743a1138239a"} Dec 11 13:25:16 crc kubenswrapper[4898]: I1211 13:25:16.257774 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-ca33-account-create-update-8x4rf" event={"ID":"95b56d2f-c89b-4531-9e3a-f2c7accab5dd","Type":"ContainerStarted","Data":"797378eee6b3ff2352d2177447ac21d0c23d1c26648c4f58c7bc08c238f17322"} Dec 11 13:25:16 crc kubenswrapper[4898]: I1211 13:25:16.862231 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-lkkj2"] Dec 11 13:25:16 crc kubenswrapper[4898]: I1211 13:25:16.866813 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:16 crc kubenswrapper[4898]: I1211 13:25:16.870484 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hm6b4" Dec 11 13:25:16 crc kubenswrapper[4898]: I1211 13:25:16.872548 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 11 13:25:16 crc kubenswrapper[4898]: I1211 13:25:16.890568 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lkkj2"] Dec 11 13:25:16 crc kubenswrapper[4898]: I1211 13:25:16.922115 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-config-data\") pod \"glance-db-sync-lkkj2\" (UID: \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:16 crc kubenswrapper[4898]: I1211 13:25:16.922235 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-combined-ca-bundle\") pod \"glance-db-sync-lkkj2\" (UID: \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:16 crc kubenswrapper[4898]: I1211 13:25:16.922266 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx7jx\" (UniqueName: \"kubernetes.io/projected/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-kube-api-access-kx7jx\") pod \"glance-db-sync-lkkj2\" (UID: \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:16 crc kubenswrapper[4898]: I1211 13:25:16.922348 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-db-sync-config-data\") pod \"glance-db-sync-lkkj2\" (UID: \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.024512 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-db-sync-config-data\") pod \"glance-db-sync-lkkj2\" (UID: \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.024746 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-config-data\") pod \"glance-db-sync-lkkj2\" (UID: \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.024851 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-combined-ca-bundle\") pod \"glance-db-sync-lkkj2\" (UID: \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.024912 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx7jx\" (UniqueName: \"kubernetes.io/projected/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-kube-api-access-kx7jx\") pod \"glance-db-sync-lkkj2\" (UID: \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.042991 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-db-sync-config-data\") pod \"glance-db-sync-lkkj2\" (UID: \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.043545 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-combined-ca-bundle\") pod \"glance-db-sync-lkkj2\" (UID: \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.045575 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-config-data\") pod \"glance-db-sync-lkkj2\" (UID: \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.052922 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx7jx\" (UniqueName: \"kubernetes.io/projected/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-kube-api-access-kx7jx\") pod \"glance-db-sync-lkkj2\" (UID: 
\"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.195128 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.793968 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.808334 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-ca33-account-create-update-8x4rf" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.847124 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqlrn\" (UniqueName: \"kubernetes.io/projected/8cc39efd-7ed2-48a9-a27e-e637091064ef-kube-api-access-vqlrn\") pod \"8cc39efd-7ed2-48a9-a27e-e637091064ef\" (UID: \"8cc39efd-7ed2-48a9-a27e-e637091064ef\") " Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.847255 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cc39efd-7ed2-48a9-a27e-e637091064ef-operator-scripts\") pod \"8cc39efd-7ed2-48a9-a27e-e637091064ef\" (UID: \"8cc39efd-7ed2-48a9-a27e-e637091064ef\") " Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.848209 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cc39efd-7ed2-48a9-a27e-e637091064ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8cc39efd-7ed2-48a9-a27e-e637091064ef" (UID: "8cc39efd-7ed2-48a9-a27e-e637091064ef"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.853395 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cc39efd-7ed2-48a9-a27e-e637091064ef-kube-api-access-vqlrn" (OuterVolumeSpecName: "kube-api-access-vqlrn") pod "8cc39efd-7ed2-48a9-a27e-e637091064ef" (UID: "8cc39efd-7ed2-48a9-a27e-e637091064ef"). InnerVolumeSpecName "kube-api-access-vqlrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.949548 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95b56d2f-c89b-4531-9e3a-f2c7accab5dd-operator-scripts\") pod \"95b56d2f-c89b-4531-9e3a-f2c7accab5dd\" (UID: \"95b56d2f-c89b-4531-9e3a-f2c7accab5dd\") " Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.949746 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27xj4\" (UniqueName: \"kubernetes.io/projected/95b56d2f-c89b-4531-9e3a-f2c7accab5dd-kube-api-access-27xj4\") pod \"95b56d2f-c89b-4531-9e3a-f2c7accab5dd\" (UID: \"95b56d2f-c89b-4531-9e3a-f2c7accab5dd\") " Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.950317 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cc39efd-7ed2-48a9-a27e-e637091064ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.950339 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqlrn\" (UniqueName: \"kubernetes.io/projected/8cc39efd-7ed2-48a9-a27e-e637091064ef-kube-api-access-vqlrn\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.951309 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/95b56d2f-c89b-4531-9e3a-f2c7accab5dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95b56d2f-c89b-4531-9e3a-f2c7accab5dd" (UID: "95b56d2f-c89b-4531-9e3a-f2c7accab5dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:17 crc kubenswrapper[4898]: I1211 13:25:17.954161 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b56d2f-c89b-4531-9e3a-f2c7accab5dd-kube-api-access-27xj4" (OuterVolumeSpecName: "kube-api-access-27xj4") pod "95b56d2f-c89b-4531-9e3a-f2c7accab5dd" (UID: "95b56d2f-c89b-4531-9e3a-f2c7accab5dd"). InnerVolumeSpecName "kube-api-access-27xj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:18 crc kubenswrapper[4898]: I1211 13:25:18.015772 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lkkj2"] Dec 11 13:25:18 crc kubenswrapper[4898]: I1211 13:25:18.052206 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95b56d2f-c89b-4531-9e3a-f2c7accab5dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:18 crc kubenswrapper[4898]: I1211 13:25:18.052240 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27xj4\" (UniqueName: \"kubernetes.io/projected/95b56d2f-c89b-4531-9e3a-f2c7accab5dd-kube-api-access-27xj4\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:18 crc kubenswrapper[4898]: I1211 13:25:18.278201 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-ca33-account-create-update-8x4rf" event={"ID":"95b56d2f-c89b-4531-9e3a-f2c7accab5dd","Type":"ContainerDied","Data":"797378eee6b3ff2352d2177447ac21d0c23d1c26648c4f58c7bc08c238f17322"} Dec 11 13:25:18 crc kubenswrapper[4898]: I1211 13:25:18.278227 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-ca33-account-create-update-8x4rf" Dec 11 13:25:18 crc kubenswrapper[4898]: I1211 13:25:18.278242 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="797378eee6b3ff2352d2177447ac21d0c23d1c26648c4f58c7bc08c238f17322" Dec 11 13:25:18 crc kubenswrapper[4898]: I1211 13:25:18.281115 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"87df6c08-c7eb-4e54-a329-6343e195c6f3","Type":"ContainerStarted","Data":"fa48f9d39cded41edeb6fe71cbe1dbe8ca554cefb4447d5d51b85325ebbe31ce"} Dec 11 13:25:18 crc kubenswrapper[4898]: I1211 13:25:18.283401 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lkkj2" event={"ID":"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1","Type":"ContainerStarted","Data":"c332a2bc37bb191ef4009f22e8e94334d007bd79dca5b2a319d184d5717fbc7b"} Dec 11 13:25:18 crc kubenswrapper[4898]: I1211 13:25:18.285128 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq" Dec 11 13:25:18 crc kubenswrapper[4898]: I1211 13:25:18.289735 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq" event={"ID":"8cc39efd-7ed2-48a9-a27e-e637091064ef","Type":"ContainerDied","Data":"5b1b44cfdc32defc8d4544c8ab15690179e74a1d227efebfc78cfa40ed409fd7"} Dec 11 13:25:18 crc kubenswrapper[4898]: I1211 13:25:18.289783 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b1b44cfdc32defc8d4544c8ab15690179e74a1d227efebfc78cfa40ed409fd7" Dec 11 13:25:18 crc kubenswrapper[4898]: I1211 13:25:18.316391 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=13.947755191 podStartE2EDuration="55.316371269s" podCreationTimestamp="2025-12-11 13:24:23 +0000 UTC" firstStartedPulling="2025-12-11 13:24:35.890814428 +0000 UTC m=+1233.463140865" lastFinishedPulling="2025-12-11 13:25:17.259430506 +0000 UTC m=+1274.831756943" observedRunningTime="2025-12-11 13:25:18.301688062 +0000 UTC m=+1275.874014499" watchObservedRunningTime="2025-12-11 13:25:18.316371269 +0000 UTC m=+1275.888697716" Dec 11 13:25:18 crc kubenswrapper[4898]: I1211 13:25:18.833149 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:25:18 crc kubenswrapper[4898]: I1211 13:25:18.927635 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2t4lr"] Dec 11 13:25:18 crc kubenswrapper[4898]: I1211 13:25:18.927909 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" podUID="7a828046-bb9f-4a7a-be64-d18efa6ccb63" containerName="dnsmasq-dns" containerID="cri-o://aaa2f43d8b651ae4b7c4a21743945fd1180e1bc43b4fd44b200752ab13360407" gracePeriod=10 Dec 11 13:25:19 crc 
kubenswrapper[4898]: I1211 13:25:19.305397 4898 generic.go:334] "Generic (PLEG): container finished" podID="7a828046-bb9f-4a7a-be64-d18efa6ccb63" containerID="aaa2f43d8b651ae4b7c4a21743945fd1180e1bc43b4fd44b200752ab13360407" exitCode=0 Dec 11 13:25:19 crc kubenswrapper[4898]: I1211 13:25:19.305581 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" event={"ID":"7a828046-bb9f-4a7a-be64-d18efa6ccb63","Type":"ContainerDied","Data":"aaa2f43d8b651ae4b7c4a21743945fd1180e1bc43b4fd44b200752ab13360407"} Dec 11 13:25:19 crc kubenswrapper[4898]: I1211 13:25:19.437883 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" Dec 11 13:25:19 crc kubenswrapper[4898]: I1211 13:25:19.486291 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a828046-bb9f-4a7a-be64-d18efa6ccb63-dns-svc\") pod \"7a828046-bb9f-4a7a-be64-d18efa6ccb63\" (UID: \"7a828046-bb9f-4a7a-be64-d18efa6ccb63\") " Dec 11 13:25:19 crc kubenswrapper[4898]: I1211 13:25:19.486477 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67xj9\" (UniqueName: \"kubernetes.io/projected/7a828046-bb9f-4a7a-be64-d18efa6ccb63-kube-api-access-67xj9\") pod \"7a828046-bb9f-4a7a-be64-d18efa6ccb63\" (UID: \"7a828046-bb9f-4a7a-be64-d18efa6ccb63\") " Dec 11 13:25:19 crc kubenswrapper[4898]: I1211 13:25:19.486556 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a828046-bb9f-4a7a-be64-d18efa6ccb63-config\") pod \"7a828046-bb9f-4a7a-be64-d18efa6ccb63\" (UID: \"7a828046-bb9f-4a7a-be64-d18efa6ccb63\") " Dec 11 13:25:19 crc kubenswrapper[4898]: I1211 13:25:19.491689 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7a828046-bb9f-4a7a-be64-d18efa6ccb63-kube-api-access-67xj9" (OuterVolumeSpecName: "kube-api-access-67xj9") pod "7a828046-bb9f-4a7a-be64-d18efa6ccb63" (UID: "7a828046-bb9f-4a7a-be64-d18efa6ccb63"). InnerVolumeSpecName "kube-api-access-67xj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:19 crc kubenswrapper[4898]: I1211 13:25:19.541563 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a828046-bb9f-4a7a-be64-d18efa6ccb63-config" (OuterVolumeSpecName: "config") pod "7a828046-bb9f-4a7a-be64-d18efa6ccb63" (UID: "7a828046-bb9f-4a7a-be64-d18efa6ccb63"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:19 crc kubenswrapper[4898]: I1211 13:25:19.568566 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a828046-bb9f-4a7a-be64-d18efa6ccb63-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a828046-bb9f-4a7a-be64-d18efa6ccb63" (UID: "7a828046-bb9f-4a7a-be64-d18efa6ccb63"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:19 crc kubenswrapper[4898]: I1211 13:25:19.588300 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a828046-bb9f-4a7a-be64-d18efa6ccb63-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:19 crc kubenswrapper[4898]: I1211 13:25:19.588408 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67xj9\" (UniqueName: \"kubernetes.io/projected/7a828046-bb9f-4a7a-be64-d18efa6ccb63-kube-api-access-67xj9\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:19 crc kubenswrapper[4898]: I1211 13:25:19.588498 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a828046-bb9f-4a7a-be64-d18efa6ccb63-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:19 crc kubenswrapper[4898]: I1211 13:25:19.613663 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:20 crc kubenswrapper[4898]: I1211 13:25:20.318090 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" event={"ID":"7a828046-bb9f-4a7a-be64-d18efa6ccb63","Type":"ContainerDied","Data":"4a36b6dbb86539d1273c5afd15fde10ddc4923de98fddbb747d23996c720749a"} Dec 11 13:25:20 crc kubenswrapper[4898]: I1211 13:25:20.318416 4898 scope.go:117] "RemoveContainer" containerID="aaa2f43d8b651ae4b7c4a21743945fd1180e1bc43b4fd44b200752ab13360407" Dec 11 13:25:20 crc kubenswrapper[4898]: I1211 13:25:20.319853 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2t4lr" Dec 11 13:25:20 crc kubenswrapper[4898]: I1211 13:25:20.320748 4898 generic.go:334] "Generic (PLEG): container finished" podID="6eaf839e-626a-4f9a-b489-d9c37cee9065" containerID="bbc99eb3803c73f2cd3f89521dd3bdaae4c7d877f90cf10b7734bbb008573b50" exitCode=0 Dec 11 13:25:20 crc kubenswrapper[4898]: I1211 13:25:20.322074 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6eaf839e-626a-4f9a-b489-d9c37cee9065","Type":"ContainerDied","Data":"bbc99eb3803c73f2cd3f89521dd3bdaae4c7d877f90cf10b7734bbb008573b50"} Dec 11 13:25:20 crc kubenswrapper[4898]: I1211 13:25:20.354564 4898 scope.go:117] "RemoveContainer" containerID="fcf247872ae4f36b4603111b5c4efd8f5e5a51791692ec12d59f3246e4661e8e" Dec 11 13:25:20 crc kubenswrapper[4898]: I1211 13:25:20.569725 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2t4lr"] Dec 11 13:25:20 crc kubenswrapper[4898]: I1211 13:25:20.577369 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2t4lr"] Dec 11 13:25:20 crc kubenswrapper[4898]: I1211 13:25:20.609092 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0" Dec 11 13:25:20 crc kubenswrapper[4898]: E1211 13:25:20.609389 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 11 13:25:20 crc kubenswrapper[4898]: E1211 13:25:20.609408 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 11 13:25:20 crc kubenswrapper[4898]: E1211 13:25:20.609490 4898 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift podName:717e7951-d95b-497f-b2b7-3ec4ef755642 nodeName:}" failed. No retries permitted until 2025-12-11 13:25:36.609444397 +0000 UTC m=+1294.181770834 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift") pod "swift-storage-0" (UID: "717e7951-d95b-497f-b2b7-3ec4ef755642") : configmap "swift-ring-files" not found Dec 11 13:25:20 crc kubenswrapper[4898]: I1211 13:25:20.789429 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a828046-bb9f-4a7a-be64-d18efa6ccb63" path="/var/lib/kubelet/pods/7a828046-bb9f-4a7a-be64-d18efa6ccb63/volumes" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.504310 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lsxj7" podUID="0c503750-f2d3-42e3-84ba-1db55db9228f" containerName="ovn-controller" probeResult="failure" output=< Dec 11 13:25:21 crc kubenswrapper[4898]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 11 13:25:21 crc kubenswrapper[4898]: > Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.577064 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fxk76" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.592920 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fxk76" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.805289 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lsxj7-config-6m5ww"] Dec 11 13:25:21 crc kubenswrapper[4898]: E1211 13:25:21.805759 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a828046-bb9f-4a7a-be64-d18efa6ccb63" containerName="dnsmasq-dns" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.805777 4898 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7a828046-bb9f-4a7a-be64-d18efa6ccb63" containerName="dnsmasq-dns" Dec 11 13:25:21 crc kubenswrapper[4898]: E1211 13:25:21.805797 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a828046-bb9f-4a7a-be64-d18efa6ccb63" containerName="init" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.805806 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a828046-bb9f-4a7a-be64-d18efa6ccb63" containerName="init" Dec 11 13:25:21 crc kubenswrapper[4898]: E1211 13:25:21.805829 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b56d2f-c89b-4531-9e3a-f2c7accab5dd" containerName="mariadb-account-create-update" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.805835 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b56d2f-c89b-4531-9e3a-f2c7accab5dd" containerName="mariadb-account-create-update" Dec 11 13:25:21 crc kubenswrapper[4898]: E1211 13:25:21.805850 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc39efd-7ed2-48a9-a27e-e637091064ef" containerName="mariadb-database-create" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.805856 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc39efd-7ed2-48a9-a27e-e637091064ef" containerName="mariadb-database-create" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.806076 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="95b56d2f-c89b-4531-9e3a-f2c7accab5dd" containerName="mariadb-account-create-update" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.806096 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a828046-bb9f-4a7a-be64-d18efa6ccb63" containerName="dnsmasq-dns" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.806111 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cc39efd-7ed2-48a9-a27e-e637091064ef" containerName="mariadb-database-create" Dec 11 13:25:21 crc 
kubenswrapper[4898]: I1211 13:25:21.806835 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.811477 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.837208 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-log-ovn\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.837315 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-run-ovn\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.837369 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j56tg\" (UniqueName: \"kubernetes.io/projected/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-kube-api-access-j56tg\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.837471 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-run\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " 
pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.837542 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-scripts\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.837617 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-additional-scripts\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.845000 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lsxj7-config-6m5ww"] Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.939500 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-scripts\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.939612 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-additional-scripts\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.939710 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-log-ovn\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.939792 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-run-ovn\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.939834 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j56tg\" (UniqueName: \"kubernetes.io/projected/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-kube-api-access-j56tg\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.939906 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-run\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.940057 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-run\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.940062 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-log-ovn\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.940126 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-run-ovn\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.940368 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-additional-scripts\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.942188 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-scripts\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:21 crc kubenswrapper[4898]: I1211 13:25:21.960597 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j56tg\" (UniqueName: \"kubernetes.io/projected/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-kube-api-access-j56tg\") pod \"ovn-controller-lsxj7-config-6m5ww\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:22 crc kubenswrapper[4898]: I1211 13:25:22.140795 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:22 crc kubenswrapper[4898]: I1211 13:25:22.365174 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6eaf839e-626a-4f9a-b489-d9c37cee9065","Type":"ContainerStarted","Data":"a46216fb9d726589cfc7c90a401d302f459367871eee5fff476cedaf7f8498bd"} Dec 11 13:25:22 crc kubenswrapper[4898]: I1211 13:25:22.366898 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 11 13:25:22 crc kubenswrapper[4898]: I1211 13:25:22.370839 4898 generic.go:334] "Generic (PLEG): container finished" podID="026f0391-aa61-4b41-963f-239e08b0cd34" containerID="9bd52eab2cebdafa7238c3a7545f94428c7ec74df13750e07166556883515e5a" exitCode=0 Dec 11 13:25:22 crc kubenswrapper[4898]: I1211 13:25:22.371354 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"026f0391-aa61-4b41-963f-239e08b0cd34","Type":"ContainerDied","Data":"9bd52eab2cebdafa7238c3a7545f94428c7ec74df13750e07166556883515e5a"} Dec 11 13:25:22 crc kubenswrapper[4898]: I1211 13:25:22.455856 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=61.279460014 podStartE2EDuration="1m6.455835522s" podCreationTimestamp="2025-12-11 13:24:16 +0000 UTC" firstStartedPulling="2025-12-11 13:24:34.073386387 +0000 UTC m=+1231.645712824" lastFinishedPulling="2025-12-11 13:24:39.249761895 +0000 UTC m=+1236.822088332" observedRunningTime="2025-12-11 13:25:22.40798516 +0000 UTC m=+1279.980311597" watchObservedRunningTime="2025-12-11 13:25:22.455835522 +0000 UTC m=+1280.028161959" Dec 11 13:25:22 crc kubenswrapper[4898]: I1211 13:25:22.572640 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lsxj7-config-6m5ww"] Dec 11 13:25:22 crc kubenswrapper[4898]: W1211 13:25:22.585690 4898 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d891dcc_9c6b_4d7f_940a_df6c42e0a3e3.slice/crio-e92b79dd99342c211a0b70b12d813bab380f8e17916cc66e2c46e40cbd5cfd98 WatchSource:0}: Error finding container e92b79dd99342c211a0b70b12d813bab380f8e17916cc66e2c46e40cbd5cfd98: Status 404 returned error can't find the container with id e92b79dd99342c211a0b70b12d813bab380f8e17916cc66e2c46e40cbd5cfd98 Dec 11 13:25:23 crc kubenswrapper[4898]: I1211 13:25:23.382262 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lsxj7-config-6m5ww" event={"ID":"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3","Type":"ContainerStarted","Data":"e92b79dd99342c211a0b70b12d813bab380f8e17916cc66e2c46e40cbd5cfd98"} Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.053925 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.055990 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.058657 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.077776 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.193671 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdqz2\" (UniqueName: \"kubernetes.io/projected/3a77ab96-9580-4509-ba2c-963e51ed44a5-kube-api-access-fdqz2\") pod \"mysqld-exporter-0\" (UID: \"3a77ab96-9580-4509-ba2c-963e51ed44a5\") " pod="openstack/mysqld-exporter-0" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.194128 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a77ab96-9580-4509-ba2c-963e51ed44a5-config-data\") pod \"mysqld-exporter-0\" (UID: \"3a77ab96-9580-4509-ba2c-963e51ed44a5\") " pod="openstack/mysqld-exporter-0" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.194208 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a77ab96-9580-4509-ba2c-963e51ed44a5-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"3a77ab96-9580-4509-ba2c-963e51ed44a5\") " pod="openstack/mysqld-exporter-0" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.296744 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdqz2\" (UniqueName: \"kubernetes.io/projected/3a77ab96-9580-4509-ba2c-963e51ed44a5-kube-api-access-fdqz2\") pod \"mysqld-exporter-0\" (UID: \"3a77ab96-9580-4509-ba2c-963e51ed44a5\") " pod="openstack/mysqld-exporter-0" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.296814 
4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a77ab96-9580-4509-ba2c-963e51ed44a5-config-data\") pod \"mysqld-exporter-0\" (UID: \"3a77ab96-9580-4509-ba2c-963e51ed44a5\") " pod="openstack/mysqld-exporter-0" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.296851 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a77ab96-9580-4509-ba2c-963e51ed44a5-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"3a77ab96-9580-4509-ba2c-963e51ed44a5\") " pod="openstack/mysqld-exporter-0" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.305320 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a77ab96-9580-4509-ba2c-963e51ed44a5-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"3a77ab96-9580-4509-ba2c-963e51ed44a5\") " pod="openstack/mysqld-exporter-0" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.310105 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a77ab96-9580-4509-ba2c-963e51ed44a5-config-data\") pod \"mysqld-exporter-0\" (UID: \"3a77ab96-9580-4509-ba2c-963e51ed44a5\") " pod="openstack/mysqld-exporter-0" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.341476 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdqz2\" (UniqueName: \"kubernetes.io/projected/3a77ab96-9580-4509-ba2c-963e51ed44a5-kube-api-access-fdqz2\") pod \"mysqld-exporter-0\" (UID: \"3a77ab96-9580-4509-ba2c-963e51ed44a5\") " pod="openstack/mysqld-exporter-0" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.380494 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.401820 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lsxj7-config-6m5ww" event={"ID":"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3","Type":"ContainerStarted","Data":"19984b831f4957ac3cf343e7090d9ec3c271a60a5d0b21b8b0f78fbb9f8a6bdf"} Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.405760 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"026f0391-aa61-4b41-963f-239e08b0cd34","Type":"ContainerStarted","Data":"47a1737676630cf454851548de058721d30c553e93dce666a023ff5347745677"} Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.405956 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.434970 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lsxj7-config-6m5ww" podStartSLOduration=3.434947414 podStartE2EDuration="3.434947414s" podCreationTimestamp="2025-12-11 13:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:25:24.420757839 +0000 UTC m=+1281.993084276" watchObservedRunningTime="2025-12-11 13:25:24.434947414 +0000 UTC m=+1282.007273851" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.614533 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.616463 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:24 crc kubenswrapper[4898]: I1211 13:25:24.648728 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" 
podStartSLOduration=56.572085172 podStartE2EDuration="1m7.648709769s" podCreationTimestamp="2025-12-11 13:24:17 +0000 UTC" firstStartedPulling="2025-12-11 13:24:35.293380148 +0000 UTC m=+1232.865706585" lastFinishedPulling="2025-12-11 13:24:46.370004735 +0000 UTC m=+1243.942331182" observedRunningTime="2025-12-11 13:25:24.459786788 +0000 UTC m=+1282.032113225" watchObservedRunningTime="2025-12-11 13:25:24.648709769 +0000 UTC m=+1282.221036206" Dec 11 13:25:25 crc kubenswrapper[4898]: I1211 13:25:25.441567 4898 generic.go:334] "Generic (PLEG): container finished" podID="6d5760a1-aea8-4f95-8da7-8832f8879d57" containerID="44911eebd274ee213fd2d3374b6b2e15a715288e93dbe3edcd83c9ff6f131c30" exitCode=0 Dec 11 13:25:25 crc kubenswrapper[4898]: I1211 13:25:25.441679 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lx2j5" event={"ID":"6d5760a1-aea8-4f95-8da7-8832f8879d57","Type":"ContainerDied","Data":"44911eebd274ee213fd2d3374b6b2e15a715288e93dbe3edcd83c9ff6f131c30"} Dec 11 13:25:25 crc kubenswrapper[4898]: I1211 13:25:25.449610 4898 generic.go:334] "Generic (PLEG): container finished" podID="3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3" containerID="19984b831f4957ac3cf343e7090d9ec3c271a60a5d0b21b8b0f78fbb9f8a6bdf" exitCode=0 Dec 11 13:25:25 crc kubenswrapper[4898]: I1211 13:25:25.449724 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lsxj7-config-6m5ww" event={"ID":"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3","Type":"ContainerDied","Data":"19984b831f4957ac3cf343e7090d9ec3c271a60a5d0b21b8b0f78fbb9f8a6bdf"} Dec 11 13:25:25 crc kubenswrapper[4898]: I1211 13:25:25.456184 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:26 crc kubenswrapper[4898]: I1211 13:25:26.521432 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-lsxj7" Dec 11 13:25:28 crc kubenswrapper[4898]: I1211 
13:25:28.134230 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 11 13:25:28 crc kubenswrapper[4898]: I1211 13:25:28.134771 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerName="prometheus" containerID="cri-o://69179c7c18851bf5d304c6633fde399a71c8e469371a335fdd351be9705b5692" gracePeriod=600 Dec 11 13:25:28 crc kubenswrapper[4898]: I1211 13:25:28.134868 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerName="thanos-sidecar" containerID="cri-o://fa48f9d39cded41edeb6fe71cbe1dbe8ca554cefb4447d5d51b85325ebbe31ce" gracePeriod=600 Dec 11 13:25:28 crc kubenswrapper[4898]: I1211 13:25:28.134906 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerName="config-reloader" containerID="cri-o://39e2333a7a41f963b9bfefb04ce49dffd32436c98690f83c354375a6e1f91a1e" gracePeriod=600 Dec 11 13:25:28 crc kubenswrapper[4898]: I1211 13:25:28.491248 4898 generic.go:334] "Generic (PLEG): container finished" podID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerID="fa48f9d39cded41edeb6fe71cbe1dbe8ca554cefb4447d5d51b85325ebbe31ce" exitCode=0 Dec 11 13:25:28 crc kubenswrapper[4898]: I1211 13:25:28.491599 4898 generic.go:334] "Generic (PLEG): container finished" podID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerID="69179c7c18851bf5d304c6633fde399a71c8e469371a335fdd351be9705b5692" exitCode=0 Dec 11 13:25:28 crc kubenswrapper[4898]: I1211 13:25:28.497681 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"87df6c08-c7eb-4e54-a329-6343e195c6f3","Type":"ContainerDied","Data":"fa48f9d39cded41edeb6fe71cbe1dbe8ca554cefb4447d5d51b85325ebbe31ce"} Dec 11 13:25:28 crc kubenswrapper[4898]: I1211 13:25:28.497768 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"87df6c08-c7eb-4e54-a329-6343e195c6f3","Type":"ContainerDied","Data":"69179c7c18851bf5d304c6633fde399a71c8e469371a335fdd351be9705b5692"} Dec 11 13:25:29 crc kubenswrapper[4898]: I1211 13:25:29.502470 4898 generic.go:334] "Generic (PLEG): container finished" podID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerID="39e2333a7a41f963b9bfefb04ce49dffd32436c98690f83c354375a6e1f91a1e" exitCode=0 Dec 11 13:25:29 crc kubenswrapper[4898]: I1211 13:25:29.502587 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"87df6c08-c7eb-4e54-a329-6343e195c6f3","Type":"ContainerDied","Data":"39e2333a7a41f963b9bfefb04ce49dffd32436c98690f83c354375a6e1f91a1e"} Dec 11 13:25:29 crc kubenswrapper[4898]: I1211 13:25:29.614495 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.138:9090/-/ready\": dial tcp 10.217.0.138:9090: connect: connection refused" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.161076 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.186953 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.278716 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6d5760a1-aea8-4f95-8da7-8832f8879d57-ring-data-devices\") pod \"6d5760a1-aea8-4f95-8da7-8832f8879d57\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.278767 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-scripts\") pod \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.278870 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-log-ovn\") pod \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.278886 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d5760a1-aea8-4f95-8da7-8832f8879d57-scripts\") pod \"6d5760a1-aea8-4f95-8da7-8832f8879d57\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.278900 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-additional-scripts\") pod \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.278931 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-dispersionconf\") pod \"6d5760a1-aea8-4f95-8da7-8832f8879d57\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.278953 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6d5760a1-aea8-4f95-8da7-8832f8879d57-etc-swift\") pod \"6d5760a1-aea8-4f95-8da7-8832f8879d57\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.278989 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-run-ovn\") pod \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.279020 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl5t4\" (UniqueName: \"kubernetes.io/projected/6d5760a1-aea8-4f95-8da7-8832f8879d57-kube-api-access-sl5t4\") pod \"6d5760a1-aea8-4f95-8da7-8832f8879d57\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.279060 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-combined-ca-bundle\") pod \"6d5760a1-aea8-4f95-8da7-8832f8879d57\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.279094 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j56tg\" (UniqueName: \"kubernetes.io/projected/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-kube-api-access-j56tg\") pod \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " Dec 11 13:25:32 crc 
kubenswrapper[4898]: I1211 13:25:32.279177 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-run\") pod \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\" (UID: \"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.279201 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-swiftconf\") pod \"6d5760a1-aea8-4f95-8da7-8832f8879d57\" (UID: \"6d5760a1-aea8-4f95-8da7-8832f8879d57\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.280124 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d5760a1-aea8-4f95-8da7-8832f8879d57-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6d5760a1-aea8-4f95-8da7-8832f8879d57" (UID: "6d5760a1-aea8-4f95-8da7-8832f8879d57"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.280169 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3" (UID: "3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.280161 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3" (UID: "3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.280214 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-run" (OuterVolumeSpecName: "var-run") pod "3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3" (UID: "3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.280526 4898 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.280543 4898 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.280552 4898 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-var-run\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.280561 4898 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6d5760a1-aea8-4f95-8da7-8832f8879d57-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.280760 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3" (UID: "3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.281527 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d5760a1-aea8-4f95-8da7-8832f8879d57-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6d5760a1-aea8-4f95-8da7-8832f8879d57" (UID: "6d5760a1-aea8-4f95-8da7-8832f8879d57"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.281715 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-scripts" (OuterVolumeSpecName: "scripts") pod "3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3" (UID: "3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.285830 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d5760a1-aea8-4f95-8da7-8832f8879d57-kube-api-access-sl5t4" (OuterVolumeSpecName: "kube-api-access-sl5t4") pod "6d5760a1-aea8-4f95-8da7-8832f8879d57" (UID: "6d5760a1-aea8-4f95-8da7-8832f8879d57"). InnerVolumeSpecName "kube-api-access-sl5t4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.286733 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-kube-api-access-j56tg" (OuterVolumeSpecName: "kube-api-access-j56tg") pod "3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3" (UID: "3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3"). InnerVolumeSpecName "kube-api-access-j56tg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.291983 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6d5760a1-aea8-4f95-8da7-8832f8879d57" (UID: "6d5760a1-aea8-4f95-8da7-8832f8879d57"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.309326 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6d5760a1-aea8-4f95-8da7-8832f8879d57" (UID: "6d5760a1-aea8-4f95-8da7-8832f8879d57"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.312027 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d5760a1-aea8-4f95-8da7-8832f8879d57-scripts" (OuterVolumeSpecName: "scripts") pod "6d5760a1-aea8-4f95-8da7-8832f8879d57" (UID: "6d5760a1-aea8-4f95-8da7-8832f8879d57"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.348983 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.352219 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d5760a1-aea8-4f95-8da7-8832f8879d57" (UID: "6d5760a1-aea8-4f95-8da7-8832f8879d57"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.382271 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d5760a1-aea8-4f95-8da7-8832f8879d57-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.382300 4898 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.382311 4898 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.382321 4898 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6d5760a1-aea8-4f95-8da7-8832f8879d57-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.382329 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl5t4\" (UniqueName: \"kubernetes.io/projected/6d5760a1-aea8-4f95-8da7-8832f8879d57-kube-api-access-sl5t4\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.382337 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.382346 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j56tg\" (UniqueName: \"kubernetes.io/projected/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-kube-api-access-j56tg\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 
13:25:32.382353 4898 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6d5760a1-aea8-4f95-8da7-8832f8879d57-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.382363 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.484117 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/87df6c08-c7eb-4e54-a329-6343e195c6f3-prometheus-metric-storage-rulefiles-0\") pod \"87df6c08-c7eb-4e54-a329-6343e195c6f3\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.484199 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87df6c08-c7eb-4e54-a329-6343e195c6f3-tls-assets\") pod \"87df6c08-c7eb-4e54-a329-6343e195c6f3\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.484268 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-web-config\") pod \"87df6c08-c7eb-4e54-a329-6343e195c6f3\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.484318 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-thanos-prometheus-http-client-file\") pod \"87df6c08-c7eb-4e54-a329-6343e195c6f3\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 
13:25:32.484341 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-config\") pod \"87df6c08-c7eb-4e54-a329-6343e195c6f3\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.484366 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmklw\" (UniqueName: \"kubernetes.io/projected/87df6c08-c7eb-4e54-a329-6343e195c6f3-kube-api-access-dmklw\") pod \"87df6c08-c7eb-4e54-a329-6343e195c6f3\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.484435 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87df6c08-c7eb-4e54-a329-6343e195c6f3-config-out\") pod \"87df6c08-c7eb-4e54-a329-6343e195c6f3\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.484556 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"87df6c08-c7eb-4e54-a329-6343e195c6f3\" (UID: \"87df6c08-c7eb-4e54-a329-6343e195c6f3\") " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.488629 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "87df6c08-c7eb-4e54-a329-6343e195c6f3" (UID: "87df6c08-c7eb-4e54-a329-6343e195c6f3"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.489017 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87df6c08-c7eb-4e54-a329-6343e195c6f3-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "87df6c08-c7eb-4e54-a329-6343e195c6f3" (UID: "87df6c08-c7eb-4e54-a329-6343e195c6f3"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.491747 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87df6c08-c7eb-4e54-a329-6343e195c6f3-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "87df6c08-c7eb-4e54-a329-6343e195c6f3" (UID: "87df6c08-c7eb-4e54-a329-6343e195c6f3"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.493718 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87df6c08-c7eb-4e54-a329-6343e195c6f3-kube-api-access-dmklw" (OuterVolumeSpecName: "kube-api-access-dmklw") pod "87df6c08-c7eb-4e54-a329-6343e195c6f3" (UID: "87df6c08-c7eb-4e54-a329-6343e195c6f3"). InnerVolumeSpecName "kube-api-access-dmklw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.497615 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-config" (OuterVolumeSpecName: "config") pod "87df6c08-c7eb-4e54-a329-6343e195c6f3" (UID: "87df6c08-c7eb-4e54-a329-6343e195c6f3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.497678 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87df6c08-c7eb-4e54-a329-6343e195c6f3-config-out" (OuterVolumeSpecName: "config-out") pod "87df6c08-c7eb-4e54-a329-6343e195c6f3" (UID: "87df6c08-c7eb-4e54-a329-6343e195c6f3"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.498083 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "87df6c08-c7eb-4e54-a329-6343e195c6f3" (UID: "87df6c08-c7eb-4e54-a329-6343e195c6f3"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.532762 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-web-config" (OuterVolumeSpecName: "web-config") pod "87df6c08-c7eb-4e54-a329-6343e195c6f3" (UID: "87df6c08-c7eb-4e54-a329-6343e195c6f3"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.534751 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lsxj7-config-6m5ww" event={"ID":"3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3","Type":"ContainerDied","Data":"e92b79dd99342c211a0b70b12d813bab380f8e17916cc66e2c46e40cbd5cfd98"} Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.534791 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e92b79dd99342c211a0b70b12d813bab380f8e17916cc66e2c46e40cbd5cfd98" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.534852 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lsxj7-config-6m5ww" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.550510 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.550684 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"87df6c08-c7eb-4e54-a329-6343e195c6f3","Type":"ContainerDied","Data":"2f19d00d2937ed6e40897642286ae713bf3180e3f839fd5ed58fad31930741cc"} Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.550920 4898 scope.go:117] "RemoveContainer" containerID="fa48f9d39cded41edeb6fe71cbe1dbe8ca554cefb4447d5d51b85325ebbe31ce" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.568236 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lx2j5" event={"ID":"6d5760a1-aea8-4f95-8da7-8832f8879d57","Type":"ContainerDied","Data":"b823c9a858364f5c6ccbb6a91ccbee90e356eb4e21a9dcb5d28a1a826c051716"} Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.568283 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b823c9a858364f5c6ccbb6a91ccbee90e356eb4e21a9dcb5d28a1a826c051716" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.568366 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-lx2j5" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.591684 4898 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87df6c08-c7eb-4e54-a329-6343e195c6f3-config-out\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.591730 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.591749 4898 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/87df6c08-c7eb-4e54-a329-6343e195c6f3-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.591761 4898 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87df6c08-c7eb-4e54-a329-6343e195c6f3-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.591772 4898 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-web-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.591784 4898 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.591796 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/87df6c08-c7eb-4e54-a329-6343e195c6f3-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc 
kubenswrapper[4898]: I1211 13:25:32.591806 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmklw\" (UniqueName: \"kubernetes.io/projected/87df6c08-c7eb-4e54-a329-6343e195c6f3-kube-api-access-dmklw\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.613617 4898 scope.go:117] "RemoveContainer" containerID="39e2333a7a41f963b9bfefb04ce49dffd32436c98690f83c354375a6e1f91a1e" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.614997 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.644150 4898 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.648236 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.675721 4898 scope.go:117] "RemoveContainer" containerID="69179c7c18851bf5d304c6633fde399a71c8e469371a335fdd351be9705b5692" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.685688 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 11 13:25:32 crc kubenswrapper[4898]: E1211 13:25:32.686251 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3" containerName="ovn-config" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.686280 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3" containerName="ovn-config" Dec 11 13:25:32 crc kubenswrapper[4898]: E1211 13:25:32.686300 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerName="thanos-sidecar" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.686309 4898 
state_mem.go:107] "Deleted CPUSet assignment" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerName="thanos-sidecar" Dec 11 13:25:32 crc kubenswrapper[4898]: E1211 13:25:32.686333 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5760a1-aea8-4f95-8da7-8832f8879d57" containerName="swift-ring-rebalance" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.686342 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5760a1-aea8-4f95-8da7-8832f8879d57" containerName="swift-ring-rebalance" Dec 11 13:25:32 crc kubenswrapper[4898]: E1211 13:25:32.686367 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerName="config-reloader" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.686377 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerName="config-reloader" Dec 11 13:25:32 crc kubenswrapper[4898]: E1211 13:25:32.686389 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerName="prometheus" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.686398 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerName="prometheus" Dec 11 13:25:32 crc kubenswrapper[4898]: E1211 13:25:32.686416 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerName="init-config-reloader" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.686423 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerName="init-config-reloader" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.686718 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d5760a1-aea8-4f95-8da7-8832f8879d57" containerName="swift-ring-rebalance" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.686757 4898 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3" containerName="ovn-config" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.686772 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerName="thanos-sidecar" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.686798 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerName="config-reloader" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.686811 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" containerName="prometheus" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.689245 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.692241 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.692520 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.695011 4898 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.695551 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.696752 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.696873 4898 reflector.go:368] Caches populated 
for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-csdtx" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.698819 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.700998 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.714615 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.717525 4898 scope.go:117] "RemoveContainer" containerID="abac1359be5a82b057a5932b0db735799507827d93dc6b8f2c45349e642e3283" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.736248 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.787480 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87df6c08-c7eb-4e54-a329-6343e195c6f3" path="/var/lib/kubelet/pods/87df6c08-c7eb-4e54-a329-6343e195c6f3/volumes" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.798174 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.798224 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/81600875-da95-4cb0-b179-1804494d29d8-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.798252 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.798275 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/81600875-da95-4cb0-b179-1804494d29d8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.798296 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.798323 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.799157 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-config\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.799219 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.799282 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7cjx\" (UniqueName: \"kubernetes.io/projected/81600875-da95-4cb0-b179-1804494d29d8-kube-api-access-s7cjx\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.799467 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/81600875-da95-4cb0-b179-1804494d29d8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.799549 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 
13:25:32.901124 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-config\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.901195 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.901231 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7cjx\" (UniqueName: \"kubernetes.io/projected/81600875-da95-4cb0-b179-1804494d29d8-kube-api-access-s7cjx\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.901318 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/81600875-da95-4cb0-b179-1804494d29d8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.901377 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: 
I1211 13:25:32.901485 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.901531 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/81600875-da95-4cb0-b179-1804494d29d8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.901557 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.901586 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/81600875-da95-4cb0-b179-1804494d29d8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.901604 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: 
\"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.901625 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.901817 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.903613 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/81600875-da95-4cb0-b179-1804494d29d8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.906847 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.908188 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.908342 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/81600875-da95-4cb0-b179-1804494d29d8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.908395 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-config\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.909815 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.909888 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.910880 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/81600875-da95-4cb0-b179-1804494d29d8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.914744 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/81600875-da95-4cb0-b179-1804494d29d8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.918526 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7cjx\" (UniqueName: \"kubernetes.io/projected/81600875-da95-4cb0-b179-1804494d29d8-kube-api-access-s7cjx\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:32 crc kubenswrapper[4898]: I1211 13:25:32.930523 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"81600875-da95-4cb0-b179-1804494d29d8\") " pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:33 crc kubenswrapper[4898]: I1211 13:25:33.035049 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:33 crc kubenswrapper[4898]: I1211 13:25:33.284095 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lsxj7-config-6m5ww"] Dec 11 13:25:33 crc kubenswrapper[4898]: I1211 13:25:33.298020 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lsxj7-config-6m5ww"] Dec 11 13:25:33 crc kubenswrapper[4898]: I1211 13:25:33.552095 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 11 13:25:33 crc kubenswrapper[4898]: W1211 13:25:33.562217 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81600875_da95_4cb0_b179_1804494d29d8.slice/crio-c6e24f4196d85b41d72a7404af88466ce3c0db4bde8a4614633f14819dcae6bf WatchSource:0}: Error finding container c6e24f4196d85b41d72a7404af88466ce3c0db4bde8a4614633f14819dcae6bf: Status 404 returned error can't find the container with id c6e24f4196d85b41d72a7404af88466ce3c0db4bde8a4614633f14819dcae6bf Dec 11 13:25:33 crc kubenswrapper[4898]: I1211 13:25:33.601978 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lkkj2" event={"ID":"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1","Type":"ContainerStarted","Data":"8f1383c507a0d9e655255875ca4a2160451be78073d4b1e6294538ffaa30d522"} Dec 11 13:25:33 crc kubenswrapper[4898]: I1211 13:25:33.607308 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"3a77ab96-9580-4509-ba2c-963e51ed44a5","Type":"ContainerStarted","Data":"a35f41f8c5d53cb13d130f09965357e4a0bf44514942145950cb29c970d489cc"} Dec 11 13:25:33 crc kubenswrapper[4898]: I1211 13:25:33.615754 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"81600875-da95-4cb0-b179-1804494d29d8","Type":"ContainerStarted","Data":"c6e24f4196d85b41d72a7404af88466ce3c0db4bde8a4614633f14819dcae6bf"} Dec 11 13:25:33 crc kubenswrapper[4898]: I1211 13:25:33.622636 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-lkkj2" podStartSLOduration=3.558369259 podStartE2EDuration="17.622609734s" podCreationTimestamp="2025-12-11 13:25:16 +0000 UTC" firstStartedPulling="2025-12-11 13:25:18.017953152 +0000 UTC m=+1275.590279589" lastFinishedPulling="2025-12-11 13:25:32.082193627 +0000 UTC m=+1289.654520064" observedRunningTime="2025-12-11 13:25:33.617956221 +0000 UTC m=+1291.190282668" watchObservedRunningTime="2025-12-11 13:25:33.622609734 +0000 UTC m=+1291.194936171" Dec 11 13:25:34 crc kubenswrapper[4898]: I1211 13:25:34.624519 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"3a77ab96-9580-4509-ba2c-963e51ed44a5","Type":"ContainerStarted","Data":"3947f61e9b2304dc24476513cec9b1930a2cdd56cace91f82e0ff5dfa848e8e0"} Dec 11 13:25:34 crc kubenswrapper[4898]: I1211 13:25:34.661274 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=9.023658176 podStartE2EDuration="10.661248294s" podCreationTimestamp="2025-12-11 13:25:24 +0000 UTC" firstStartedPulling="2025-12-11 13:25:32.717270349 +0000 UTC m=+1290.289596786" lastFinishedPulling="2025-12-11 13:25:34.354860467 +0000 UTC m=+1291.927186904" observedRunningTime="2025-12-11 13:25:34.641957626 +0000 UTC m=+1292.214284063" watchObservedRunningTime="2025-12-11 13:25:34.661248294 +0000 UTC m=+1292.233574731" Dec 11 13:25:34 crc kubenswrapper[4898]: I1211 13:25:34.790208 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3" path="/var/lib/kubelet/pods/3d891dcc-9c6b-4d7f-940a-df6c42e0a3e3/volumes" Dec 11 13:25:36 crc kubenswrapper[4898]: I1211 
13:25:36.697098 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0"
Dec 11 13:25:36 crc kubenswrapper[4898]: I1211 13:25:36.706531 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/717e7951-d95b-497f-b2b7-3ec4ef755642-etc-swift\") pod \"swift-storage-0\" (UID: \"717e7951-d95b-497f-b2b7-3ec4ef755642\") " pod="openstack/swift-storage-0"
Dec 11 13:25:36 crc kubenswrapper[4898]: I1211 13:25:36.729831 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Dec 11 13:25:37 crc kubenswrapper[4898]: I1211 13:25:37.311932 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Dec 11 13:25:37 crc kubenswrapper[4898]: W1211 13:25:37.545443 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod717e7951_d95b_497f_b2b7_3ec4ef755642.slice/crio-d377535bec6a263b212ef4ac617c40a823e57ae3b3823327602860dc0ed78c48 WatchSource:0}: Error finding container d377535bec6a263b212ef4ac617c40a823e57ae3b3823327602860dc0ed78c48: Status 404 returned error can't find the container with id d377535bec6a263b212ef4ac617c40a823e57ae3b3823327602860dc0ed78c48
Dec 11 13:25:37 crc kubenswrapper[4898]: I1211 13:25:37.654620 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"d377535bec6a263b212ef4ac617c40a823e57ae3b3823327602860dc0ed78c48"}
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.190739 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.494511 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.510664 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-llbsz"]
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.512400 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-llbsz"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.529782 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-llbsz"]
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.535705 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/204bbeca-833c-4e42-a955-03fde2c57e84-operator-scripts\") pod \"heat-db-create-llbsz\" (UID: \"204bbeca-833c-4e42-a955-03fde2c57e84\") " pod="openstack/heat-db-create-llbsz"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.535732 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pbmd\" (UniqueName: \"kubernetes.io/projected/204bbeca-833c-4e42-a955-03fde2c57e84-kube-api-access-4pbmd\") pod \"heat-db-create-llbsz\" (UID: \"204bbeca-833c-4e42-a955-03fde2c57e84\") " pod="openstack/heat-db-create-llbsz"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.636652 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/204bbeca-833c-4e42-a955-03fde2c57e84-operator-scripts\") pod \"heat-db-create-llbsz\" (UID: \"204bbeca-833c-4e42-a955-03fde2c57e84\") " pod="openstack/heat-db-create-llbsz"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.636707 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pbmd\" (UniqueName: \"kubernetes.io/projected/204bbeca-833c-4e42-a955-03fde2c57e84-kube-api-access-4pbmd\") pod \"heat-db-create-llbsz\" (UID: \"204bbeca-833c-4e42-a955-03fde2c57e84\") " pod="openstack/heat-db-create-llbsz"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.637899 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/204bbeca-833c-4e42-a955-03fde2c57e84-operator-scripts\") pod \"heat-db-create-llbsz\" (UID: \"204bbeca-833c-4e42-a955-03fde2c57e84\") " pod="openstack/heat-db-create-llbsz"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.666794 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pbmd\" (UniqueName: \"kubernetes.io/projected/204bbeca-833c-4e42-a955-03fde2c57e84-kube-api-access-4pbmd\") pod \"heat-db-create-llbsz\" (UID: \"204bbeca-833c-4e42-a955-03fde2c57e84\") " pod="openstack/heat-db-create-llbsz"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.719781 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-nwptx"]
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.721062 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nwptx"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.738358 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/301f2f68-2aae-4019-8b8e-a9473250de65-operator-scripts\") pod \"barbican-db-create-nwptx\" (UID: \"301f2f68-2aae-4019-8b8e-a9473250de65\") " pod="openstack/barbican-db-create-nwptx"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.738445 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh6js\" (UniqueName: \"kubernetes.io/projected/301f2f68-2aae-4019-8b8e-a9473250de65-kube-api-access-vh6js\") pod \"barbican-db-create-nwptx\" (UID: \"301f2f68-2aae-4019-8b8e-a9473250de65\") " pod="openstack/barbican-db-create-nwptx"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.740678 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-nwptx"]
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.796894 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-d4nc9"]
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.799164 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-d4nc9"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.821506 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c170-account-create-update-l27sp"]
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.823365 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c170-account-create-update-l27sp"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.825930 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.839417 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/301f2f68-2aae-4019-8b8e-a9473250de65-operator-scripts\") pod \"barbican-db-create-nwptx\" (UID: \"301f2f68-2aae-4019-8b8e-a9473250de65\") " pod="openstack/barbican-db-create-nwptx"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.839497 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4x7r\" (UniqueName: \"kubernetes.io/projected/9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e-kube-api-access-q4x7r\") pod \"cinder-c170-account-create-update-l27sp\" (UID: \"9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e\") " pod="openstack/cinder-c170-account-create-update-l27sp"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.839528 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e-operator-scripts\") pod \"cinder-c170-account-create-update-l27sp\" (UID: \"9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e\") " pod="openstack/cinder-c170-account-create-update-l27sp"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.839551 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k7lt\" (UniqueName: \"kubernetes.io/projected/11baa6bc-e306-47e7-80c0-75a2236f35d0-kube-api-access-8k7lt\") pod \"cinder-db-create-d4nc9\" (UID: \"11baa6bc-e306-47e7-80c0-75a2236f35d0\") " pod="openstack/cinder-db-create-d4nc9"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.839572 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh6js\" (UniqueName: \"kubernetes.io/projected/301f2f68-2aae-4019-8b8e-a9473250de65-kube-api-access-vh6js\") pod \"barbican-db-create-nwptx\" (UID: \"301f2f68-2aae-4019-8b8e-a9473250de65\") " pod="openstack/barbican-db-create-nwptx"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.839630 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11baa6bc-e306-47e7-80c0-75a2236f35d0-operator-scripts\") pod \"cinder-db-create-d4nc9\" (UID: \"11baa6bc-e306-47e7-80c0-75a2236f35d0\") " pod="openstack/cinder-db-create-d4nc9"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.840030 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/301f2f68-2aae-4019-8b8e-a9473250de65-operator-scripts\") pod \"barbican-db-create-nwptx\" (UID: \"301f2f68-2aae-4019-8b8e-a9473250de65\") " pod="openstack/barbican-db-create-nwptx"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.841503 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-llbsz"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.859585 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-d4nc9"]
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.877970 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c170-account-create-update-l27sp"]
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.884055 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh6js\" (UniqueName: \"kubernetes.io/projected/301f2f68-2aae-4019-8b8e-a9473250de65-kube-api-access-vh6js\") pod \"barbican-db-create-nwptx\" (UID: \"301f2f68-2aae-4019-8b8e-a9473250de65\") " pod="openstack/barbican-db-create-nwptx"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.940389 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4x7r\" (UniqueName: \"kubernetes.io/projected/9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e-kube-api-access-q4x7r\") pod \"cinder-c170-account-create-update-l27sp\" (UID: \"9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e\") " pod="openstack/cinder-c170-account-create-update-l27sp"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.940433 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e-operator-scripts\") pod \"cinder-c170-account-create-update-l27sp\" (UID: \"9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e\") " pod="openstack/cinder-c170-account-create-update-l27sp"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.940486 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k7lt\" (UniqueName: \"kubernetes.io/projected/11baa6bc-e306-47e7-80c0-75a2236f35d0-kube-api-access-8k7lt\") pod \"cinder-db-create-d4nc9\" (UID: \"11baa6bc-e306-47e7-80c0-75a2236f35d0\") " pod="openstack/cinder-db-create-d4nc9"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.940550 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11baa6bc-e306-47e7-80c0-75a2236f35d0-operator-scripts\") pod \"cinder-db-create-d4nc9\" (UID: \"11baa6bc-e306-47e7-80c0-75a2236f35d0\") " pod="openstack/cinder-db-create-d4nc9"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.942802 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11baa6bc-e306-47e7-80c0-75a2236f35d0-operator-scripts\") pod \"cinder-db-create-d4nc9\" (UID: \"11baa6bc-e306-47e7-80c0-75a2236f35d0\") " pod="openstack/cinder-db-create-d4nc9"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.943076 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e-operator-scripts\") pod \"cinder-c170-account-create-update-l27sp\" (UID: \"9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e\") " pod="openstack/cinder-c170-account-create-update-l27sp"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.951445 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-af81-account-create-update-rqlbc"]
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.952757 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-af81-account-create-update-rqlbc"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.955635 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.976713 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k7lt\" (UniqueName: \"kubernetes.io/projected/11baa6bc-e306-47e7-80c0-75a2236f35d0-kube-api-access-8k7lt\") pod \"cinder-db-create-d4nc9\" (UID: \"11baa6bc-e306-47e7-80c0-75a2236f35d0\") " pod="openstack/cinder-db-create-d4nc9"
Dec 11 13:25:38 crc kubenswrapper[4898]: I1211 13:25:38.997350 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4x7r\" (UniqueName: \"kubernetes.io/projected/9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e-kube-api-access-q4x7r\") pod \"cinder-c170-account-create-update-l27sp\" (UID: \"9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e\") " pod="openstack/cinder-c170-account-create-update-l27sp"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:38.983844 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-af81-account-create-update-rqlbc"]
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.049963 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nwptx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.085634 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-689f7"]
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.087039 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-689f7"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.121762 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-d4nc9"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.127610 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-689f7"]
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.140074 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c170-account-create-update-l27sp"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.146133 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbzmj\" (UniqueName: \"kubernetes.io/projected/eaf27272-313d-40d9-b882-151aaaf3da23-kube-api-access-pbzmj\") pod \"barbican-af81-account-create-update-rqlbc\" (UID: \"eaf27272-313d-40d9-b882-151aaaf3da23\") " pod="openstack/barbican-af81-account-create-update-rqlbc"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.146181 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf27272-313d-40d9-b882-151aaaf3da23-operator-scripts\") pod \"barbican-af81-account-create-update-rqlbc\" (UID: \"eaf27272-313d-40d9-b882-151aaaf3da23\") " pod="openstack/barbican-af81-account-create-update-rqlbc"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.165341 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-4pkrx"]
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.166949 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4pkrx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.180428 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.180941 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qkphd"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.183516 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.183564 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.250177 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b57dfab-5b26-4f64-a406-ea1701ef79d1-operator-scripts\") pod \"neutron-db-create-689f7\" (UID: \"0b57dfab-5b26-4f64-a406-ea1701ef79d1\") " pod="openstack/neutron-db-create-689f7"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.250229 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d314c3d5-6a51-4713-bae9-d25641533de2-config-data\") pod \"keystone-db-sync-4pkrx\" (UID: \"d314c3d5-6a51-4713-bae9-d25641533de2\") " pod="openstack/keystone-db-sync-4pkrx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.250261 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbm44\" (UniqueName: \"kubernetes.io/projected/0b57dfab-5b26-4f64-a406-ea1701ef79d1-kube-api-access-rbm44\") pod \"neutron-db-create-689f7\" (UID: \"0b57dfab-5b26-4f64-a406-ea1701ef79d1\") " pod="openstack/neutron-db-create-689f7"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.250307 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csk6q\" (UniqueName: \"kubernetes.io/projected/d314c3d5-6a51-4713-bae9-d25641533de2-kube-api-access-csk6q\") pod \"keystone-db-sync-4pkrx\" (UID: \"d314c3d5-6a51-4713-bae9-d25641533de2\") " pod="openstack/keystone-db-sync-4pkrx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.250355 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbzmj\" (UniqueName: \"kubernetes.io/projected/eaf27272-313d-40d9-b882-151aaaf3da23-kube-api-access-pbzmj\") pod \"barbican-af81-account-create-update-rqlbc\" (UID: \"eaf27272-313d-40d9-b882-151aaaf3da23\") " pod="openstack/barbican-af81-account-create-update-rqlbc"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.250377 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf27272-313d-40d9-b882-151aaaf3da23-operator-scripts\") pod \"barbican-af81-account-create-update-rqlbc\" (UID: \"eaf27272-313d-40d9-b882-151aaaf3da23\") " pod="openstack/barbican-af81-account-create-update-rqlbc"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.250434 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d314c3d5-6a51-4713-bae9-d25641533de2-combined-ca-bundle\") pod \"keystone-db-sync-4pkrx\" (UID: \"d314c3d5-6a51-4713-bae9-d25641533de2\") " pod="openstack/keystone-db-sync-4pkrx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.251193 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf27272-313d-40d9-b882-151aaaf3da23-operator-scripts\") pod \"barbican-af81-account-create-update-rqlbc\" (UID: \"eaf27272-313d-40d9-b882-151aaaf3da23\") " pod="openstack/barbican-af81-account-create-update-rqlbc"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.274527 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4pkrx"]
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.291682 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbzmj\" (UniqueName: \"kubernetes.io/projected/eaf27272-313d-40d9-b882-151aaaf3da23-kube-api-access-pbzmj\") pod \"barbican-af81-account-create-update-rqlbc\" (UID: \"eaf27272-313d-40d9-b882-151aaaf3da23\") " pod="openstack/barbican-af81-account-create-update-rqlbc"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.351994 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d314c3d5-6a51-4713-bae9-d25641533de2-combined-ca-bundle\") pod \"keystone-db-sync-4pkrx\" (UID: \"d314c3d5-6a51-4713-bae9-d25641533de2\") " pod="openstack/keystone-db-sync-4pkrx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.352071 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b57dfab-5b26-4f64-a406-ea1701ef79d1-operator-scripts\") pod \"neutron-db-create-689f7\" (UID: \"0b57dfab-5b26-4f64-a406-ea1701ef79d1\") " pod="openstack/neutron-db-create-689f7"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.352105 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d314c3d5-6a51-4713-bae9-d25641533de2-config-data\") pod \"keystone-db-sync-4pkrx\" (UID: \"d314c3d5-6a51-4713-bae9-d25641533de2\") " pod="openstack/keystone-db-sync-4pkrx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.352127 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbm44\" (UniqueName: \"kubernetes.io/projected/0b57dfab-5b26-4f64-a406-ea1701ef79d1-kube-api-access-rbm44\") pod \"neutron-db-create-689f7\" (UID: \"0b57dfab-5b26-4f64-a406-ea1701ef79d1\") " pod="openstack/neutron-db-create-689f7"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.352172 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csk6q\" (UniqueName: \"kubernetes.io/projected/d314c3d5-6a51-4713-bae9-d25641533de2-kube-api-access-csk6q\") pod \"keystone-db-sync-4pkrx\" (UID: \"d314c3d5-6a51-4713-bae9-d25641533de2\") " pod="openstack/keystone-db-sync-4pkrx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.356166 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b57dfab-5b26-4f64-a406-ea1701ef79d1-operator-scripts\") pod \"neutron-db-create-689f7\" (UID: \"0b57dfab-5b26-4f64-a406-ea1701ef79d1\") " pod="openstack/neutron-db-create-689f7"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.369121 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d314c3d5-6a51-4713-bae9-d25641533de2-combined-ca-bundle\") pod \"keystone-db-sync-4pkrx\" (UID: \"d314c3d5-6a51-4713-bae9-d25641533de2\") " pod="openstack/keystone-db-sync-4pkrx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.369201 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-a629-account-create-update-fsml9"]
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.378946 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d314c3d5-6a51-4713-bae9-d25641533de2-config-data\") pod \"keystone-db-sync-4pkrx\" (UID: \"d314c3d5-6a51-4713-bae9-d25641533de2\") " pod="openstack/keystone-db-sync-4pkrx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.379157 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csk6q\" (UniqueName: \"kubernetes.io/projected/d314c3d5-6a51-4713-bae9-d25641533de2-kube-api-access-csk6q\") pod \"keystone-db-sync-4pkrx\" (UID: \"d314c3d5-6a51-4713-bae9-d25641533de2\") " pod="openstack/keystone-db-sync-4pkrx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.380524 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a629-account-create-update-fsml9"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.381963 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-af81-account-create-update-rqlbc"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.384859 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.385528 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbm44\" (UniqueName: \"kubernetes.io/projected/0b57dfab-5b26-4f64-a406-ea1701ef79d1-kube-api-access-rbm44\") pod \"neutron-db-create-689f7\" (UID: \"0b57dfab-5b26-4f64-a406-ea1701ef79d1\") " pod="openstack/neutron-db-create-689f7"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.394696 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-a629-account-create-update-fsml9"]
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.430473 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-689f7"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.493636 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/821d27f4-deb1-4474-bee8-76c9caf611b1-operator-scripts\") pod \"heat-a629-account-create-update-fsml9\" (UID: \"821d27f4-deb1-4474-bee8-76c9caf611b1\") " pod="openstack/heat-a629-account-create-update-fsml9"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.493726 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hlj4\" (UniqueName: \"kubernetes.io/projected/821d27f4-deb1-4474-bee8-76c9caf611b1-kube-api-access-9hlj4\") pod \"heat-a629-account-create-update-fsml9\" (UID: \"821d27f4-deb1-4474-bee8-76c9caf611b1\") " pod="openstack/heat-a629-account-create-update-fsml9"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.500829 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fed5-account-create-update-6hwgx"]
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.514314 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fed5-account-create-update-6hwgx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.516060 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fed5-account-create-update-6hwgx"]
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.531796 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.552259 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4pkrx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.597223 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/821d27f4-deb1-4474-bee8-76c9caf611b1-operator-scripts\") pod \"heat-a629-account-create-update-fsml9\" (UID: \"821d27f4-deb1-4474-bee8-76c9caf611b1\") " pod="openstack/heat-a629-account-create-update-fsml9"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.597299 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hlj4\" (UniqueName: \"kubernetes.io/projected/821d27f4-deb1-4474-bee8-76c9caf611b1-kube-api-access-9hlj4\") pod \"heat-a629-account-create-update-fsml9\" (UID: \"821d27f4-deb1-4474-bee8-76c9caf611b1\") " pod="openstack/heat-a629-account-create-update-fsml9"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.600137 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/821d27f4-deb1-4474-bee8-76c9caf611b1-operator-scripts\") pod \"heat-a629-account-create-update-fsml9\" (UID: \"821d27f4-deb1-4474-bee8-76c9caf611b1\") " pod="openstack/heat-a629-account-create-update-fsml9"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.653927 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hlj4\" (UniqueName: \"kubernetes.io/projected/821d27f4-deb1-4474-bee8-76c9caf611b1-kube-api-access-9hlj4\") pod \"heat-a629-account-create-update-fsml9\" (UID: \"821d27f4-deb1-4474-bee8-76c9caf611b1\") " pod="openstack/heat-a629-account-create-update-fsml9"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.693849 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"81600875-da95-4cb0-b179-1804494d29d8","Type":"ContainerStarted","Data":"aa9f5174956acb94b0c096eb5585b4a0597f83b7ef735b5ecb41a49880fe69cd"}
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.702557 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c-operator-scripts\") pod \"neutron-fed5-account-create-update-6hwgx\" (UID: \"6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c\") " pod="openstack/neutron-fed5-account-create-update-6hwgx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.702812 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w22kf\" (UniqueName: \"kubernetes.io/projected/6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c-kube-api-access-w22kf\") pod \"neutron-fed5-account-create-update-6hwgx\" (UID: \"6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c\") " pod="openstack/neutron-fed5-account-create-update-6hwgx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.720638 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a629-account-create-update-fsml9"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.779777 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-nwptx"]
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.804288 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w22kf\" (UniqueName: \"kubernetes.io/projected/6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c-kube-api-access-w22kf\") pod \"neutron-fed5-account-create-update-6hwgx\" (UID: \"6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c\") " pod="openstack/neutron-fed5-account-create-update-6hwgx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.804362 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c-operator-scripts\") pod \"neutron-fed5-account-create-update-6hwgx\" (UID: \"6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c\") " pod="openstack/neutron-fed5-account-create-update-6hwgx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.805788 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c-operator-scripts\") pod \"neutron-fed5-account-create-update-6hwgx\" (UID: \"6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c\") " pod="openstack/neutron-fed5-account-create-update-6hwgx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.835789 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-llbsz"]
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.863934 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w22kf\" (UniqueName: \"kubernetes.io/projected/6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c-kube-api-access-w22kf\") pod \"neutron-fed5-account-create-update-6hwgx\" (UID: \"6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c\") " pod="openstack/neutron-fed5-account-create-update-6hwgx"
Dec 11 13:25:39 crc kubenswrapper[4898]: I1211 13:25:39.879970 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fed5-account-create-update-6hwgx"
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.017420 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-d4nc9"]
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.268101 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c170-account-create-update-l27sp"]
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.281962 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-af81-account-create-update-rqlbc"]
Dec 11 13:25:40 crc kubenswrapper[4898]: W1211 13:25:40.286267 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaf27272_313d_40d9_b882_151aaaf3da23.slice/crio-79bf62a3fccbc65aa62333c943535e24cc7998959f3df9e8b17fb419ef37897b WatchSource:0}: Error finding container 79bf62a3fccbc65aa62333c943535e24cc7998959f3df9e8b17fb419ef37897b: Status 404 returned error can't find the container with id 79bf62a3fccbc65aa62333c943535e24cc7998959f3df9e8b17fb419ef37897b
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.445551 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4pkrx"]
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.456722 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-689f7"]
Dec 11 13:25:40 crc kubenswrapper[4898]: W1211 13:25:40.457228 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd314c3d5_6a51_4713_bae9_d25641533de2.slice/crio-ca0a3293be65012d97853bd14bb6a8d637897f11e64370b9af669eec162dcd09 WatchSource:0}: Error finding container ca0a3293be65012d97853bd14bb6a8d637897f11e64370b9af669eec162dcd09: Status 404 returned error can't find the container with id ca0a3293be65012d97853bd14bb6a8d637897f11e64370b9af669eec162dcd09
Dec 11 13:25:40 crc kubenswrapper[4898]: E1211 13:25:40.508108 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod301f2f68_2aae_4019_8b8e_a9473250de65.slice/crio-conmon-815d602c2ce595b20b6ecfa99677b314b69572a0ae1586f14b22081f787294c8.scope\": RecentStats: unable to find data in memory cache]"
Dec 11 13:25:40 crc kubenswrapper[4898]: W1211 13:25:40.632108 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b57dfab_5b26_4f64_a406_ea1701ef79d1.slice/crio-ef5275c719b921ba94b9e3d1b1a3c462c45165b2f6d76b9e84e106d3f3c5d23d WatchSource:0}: Error finding container ef5275c719b921ba94b9e3d1b1a3c462c45165b2f6d76b9e84e106d3f3c5d23d: Status 404 returned error can't find the container with id ef5275c719b921ba94b9e3d1b1a3c462c45165b2f6d76b9e84e106d3f3c5d23d
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.671286 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fed5-account-create-update-6hwgx"]
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.683408 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-a629-account-create-update-fsml9"]
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.715478 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-af81-account-create-update-rqlbc" event={"ID":"eaf27272-313d-40d9-b882-151aaaf3da23","Type":"ContainerStarted","Data":"79bf62a3fccbc65aa62333c943535e24cc7998959f3df9e8b17fb419ef37897b"}
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.719838 4898 generic.go:334] "Generic (PLEG): container finished" podID="204bbeca-833c-4e42-a955-03fde2c57e84" containerID="f0f6384e87bf34cb6ab3ac838d184a33ef92aa13204fb285d70e73c0cee37487" exitCode=0
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.719942 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-llbsz" event={"ID":"204bbeca-833c-4e42-a955-03fde2c57e84","Type":"ContainerDied","Data":"f0f6384e87bf34cb6ab3ac838d184a33ef92aa13204fb285d70e73c0cee37487"}
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.719984 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-llbsz" event={"ID":"204bbeca-833c-4e42-a955-03fde2c57e84","Type":"ContainerStarted","Data":"f8ddd391dd411a7204e22cb9d2e76b5437d23898e2e657f6b9b9d4fbf4316e16"}
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.724884 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c170-account-create-update-l27sp" event={"ID":"9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e","Type":"ContainerStarted","Data":"7a0d8ed2c646db01745993be7a1b8e346b3597264c91fbd52fe2b4ceff222cb5"}
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.731354 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-689f7" event={"ID":"0b57dfab-5b26-4f64-a406-ea1701ef79d1","Type":"ContainerStarted","Data":"ef5275c719b921ba94b9e3d1b1a3c462c45165b2f6d76b9e84e106d3f3c5d23d"}
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.732691 4898 generic.go:334] "Generic (PLEG): container finished" podID="301f2f68-2aae-4019-8b8e-a9473250de65" containerID="815d602c2ce595b20b6ecfa99677b314b69572a0ae1586f14b22081f787294c8" exitCode=0
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.732735 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nwptx" event={"ID":"301f2f68-2aae-4019-8b8e-a9473250de65","Type":"ContainerDied","Data":"815d602c2ce595b20b6ecfa99677b314b69572a0ae1586f14b22081f787294c8"}
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.732750 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nwptx" event={"ID":"301f2f68-2aae-4019-8b8e-a9473250de65","Type":"ContainerStarted","Data":"0059c36b5a06279d7e27b7c91927fe622bb231d86e969f3321cf19b33c9c516c"}
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.735755 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4pkrx" event={"ID":"d314c3d5-6a51-4713-bae9-d25641533de2","Type":"ContainerStarted","Data":"ca0a3293be65012d97853bd14bb6a8d637897f11e64370b9af669eec162dcd09"}
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.742264 4898 generic.go:334] "Generic (PLEG): container finished" podID="11baa6bc-e306-47e7-80c0-75a2236f35d0" containerID="fa5a8c762147c773985fdb23692bd511f587c5c083dfe27381386c6ee7a53f2b" exitCode=0
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.742809 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-d4nc9" event={"ID":"11baa6bc-e306-47e7-80c0-75a2236f35d0","Type":"ContainerDied","Data":"fa5a8c762147c773985fdb23692bd511f587c5c083dfe27381386c6ee7a53f2b"}
Dec 11 13:25:40 crc kubenswrapper[4898]: I1211 13:25:40.743047 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-d4nc9" event={"ID":"11baa6bc-e306-47e7-80c0-75a2236f35d0","Type":"ContainerStarted","Data":"e2e62213f2e5e20ce0909570d314e89c24b012d345d0be646d0b21b17de73d9f"}
Dec 11 13:25:41 crc kubenswrapper[4898]: W1211 13:25:41.065555 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ffd41d5_6bb8_4cbf_8054_93e5e6b2a96c.slice/crio-eb757541e2f94236b84bfb076297c318fe8d75d0920788a56a26335328da3f20 WatchSource:0}: Error finding container eb757541e2f94236b84bfb076297c318fe8d75d0920788a56a26335328da3f20: Status 404 returned error can't find the container with id eb757541e2f94236b84bfb076297c318fe8d75d0920788a56a26335328da3f20
Dec 11
13:25:41 crc kubenswrapper[4898]: W1211 13:25:41.069100 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod821d27f4_deb1_4474_bee8_76c9caf611b1.slice/crio-4ee548e15470f6ac43ac1d07fc6c13e595d1fabbb3e63cb06e565da00ee66747 WatchSource:0}: Error finding container 4ee548e15470f6ac43ac1d07fc6c13e595d1fabbb3e63cb06e565da00ee66747: Status 404 returned error can't find the container with id 4ee548e15470f6ac43ac1d07fc6c13e595d1fabbb3e63cb06e565da00ee66747 Dec 11 13:25:41 crc kubenswrapper[4898]: I1211 13:25:41.759144 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"55a31507bf9dd1a21156b30933202cb9ce613bf33f71887151baacad12250af8"} Dec 11 13:25:41 crc kubenswrapper[4898]: I1211 13:25:41.759730 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"c41c1e52be8b202bd733cecc6d3cadb9114d2225e4e959d42f1f1a4c40e3cfc9"} Dec 11 13:25:41 crc kubenswrapper[4898]: I1211 13:25:41.760856 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fed5-account-create-update-6hwgx" event={"ID":"6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c","Type":"ContainerStarted","Data":"2086865a139fffb7d9fb0bd0965100c91f1332465d40efaaa0782cfb315e8bcc"} Dec 11 13:25:41 crc kubenswrapper[4898]: I1211 13:25:41.760887 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fed5-account-create-update-6hwgx" event={"ID":"6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c","Type":"ContainerStarted","Data":"eb757541e2f94236b84bfb076297c318fe8d75d0920788a56a26335328da3f20"} Dec 11 13:25:41 crc kubenswrapper[4898]: I1211 13:25:41.762216 4898 generic.go:334] "Generic (PLEG): container finished" podID="9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e" 
containerID="99a106f5af64c3e0fc5dc2b7062d46228765e0e0110a7d694a95c2416f1ece26" exitCode=0 Dec 11 13:25:41 crc kubenswrapper[4898]: I1211 13:25:41.762281 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c170-account-create-update-l27sp" event={"ID":"9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e","Type":"ContainerDied","Data":"99a106f5af64c3e0fc5dc2b7062d46228765e0e0110a7d694a95c2416f1ece26"} Dec 11 13:25:41 crc kubenswrapper[4898]: I1211 13:25:41.763905 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-689f7" event={"ID":"0b57dfab-5b26-4f64-a406-ea1701ef79d1","Type":"ContainerStarted","Data":"2aadd6b419d7953b73237ba0b801687d9b696c3257afe7c941accd0fc0fc9c4e"} Dec 11 13:25:41 crc kubenswrapper[4898]: I1211 13:25:41.765896 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a629-account-create-update-fsml9" event={"ID":"821d27f4-deb1-4474-bee8-76c9caf611b1","Type":"ContainerStarted","Data":"b276dd4c401d4d881332ff7574fd26b6e60cd7531037eb63ff2213a045643cb6"} Dec 11 13:25:41 crc kubenswrapper[4898]: I1211 13:25:41.765923 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a629-account-create-update-fsml9" event={"ID":"821d27f4-deb1-4474-bee8-76c9caf611b1","Type":"ContainerStarted","Data":"4ee548e15470f6ac43ac1d07fc6c13e595d1fabbb3e63cb06e565da00ee66747"} Dec 11 13:25:41 crc kubenswrapper[4898]: I1211 13:25:41.769904 4898 generic.go:334] "Generic (PLEG): container finished" podID="eaf27272-313d-40d9-b882-151aaaf3da23" containerID="98d3ce981c0ba2ba0f2609170da8e268d18bba274c4883942640e6a45a718f2c" exitCode=0 Dec 11 13:25:41 crc kubenswrapper[4898]: I1211 13:25:41.770024 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-af81-account-create-update-rqlbc" event={"ID":"eaf27272-313d-40d9-b882-151aaaf3da23","Type":"ContainerDied","Data":"98d3ce981c0ba2ba0f2609170da8e268d18bba274c4883942640e6a45a718f2c"} Dec 11 13:25:41 crc kubenswrapper[4898]: I1211 
13:25:41.845040 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-fed5-account-create-update-6hwgx" podStartSLOduration=2.845020399 podStartE2EDuration="2.845020399s" podCreationTimestamp="2025-12-11 13:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:25:41.780722354 +0000 UTC m=+1299.353048791" watchObservedRunningTime="2025-12-11 13:25:41.845020399 +0000 UTC m=+1299.417346836" Dec 11 13:25:41 crc kubenswrapper[4898]: I1211 13:25:41.859527 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-a629-account-create-update-fsml9" podStartSLOduration=2.859510881 podStartE2EDuration="2.859510881s" podCreationTimestamp="2025-12-11 13:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:25:41.843046507 +0000 UTC m=+1299.415372964" watchObservedRunningTime="2025-12-11 13:25:41.859510881 +0000 UTC m=+1299.431837318" Dec 11 13:25:41 crc kubenswrapper[4898]: I1211 13:25:41.902385 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-689f7" podStartSLOduration=2.902363671 podStartE2EDuration="2.902363671s" podCreationTimestamp="2025-12-11 13:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:25:41.886675247 +0000 UTC m=+1299.459001684" watchObservedRunningTime="2025-12-11 13:25:41.902363671 +0000 UTC m=+1299.474690108" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.351188 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-d4nc9" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.359598 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-llbsz" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.369338 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nwptx" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.481923 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pbmd\" (UniqueName: \"kubernetes.io/projected/204bbeca-833c-4e42-a955-03fde2c57e84-kube-api-access-4pbmd\") pod \"204bbeca-833c-4e42-a955-03fde2c57e84\" (UID: \"204bbeca-833c-4e42-a955-03fde2c57e84\") " Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.482026 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/204bbeca-833c-4e42-a955-03fde2c57e84-operator-scripts\") pod \"204bbeca-833c-4e42-a955-03fde2c57e84\" (UID: \"204bbeca-833c-4e42-a955-03fde2c57e84\") " Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.482181 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11baa6bc-e306-47e7-80c0-75a2236f35d0-operator-scripts\") pod \"11baa6bc-e306-47e7-80c0-75a2236f35d0\" (UID: \"11baa6bc-e306-47e7-80c0-75a2236f35d0\") " Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.482216 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k7lt\" (UniqueName: \"kubernetes.io/projected/11baa6bc-e306-47e7-80c0-75a2236f35d0-kube-api-access-8k7lt\") pod \"11baa6bc-e306-47e7-80c0-75a2236f35d0\" (UID: \"11baa6bc-e306-47e7-80c0-75a2236f35d0\") " Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.482251 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh6js\" (UniqueName: \"kubernetes.io/projected/301f2f68-2aae-4019-8b8e-a9473250de65-kube-api-access-vh6js\") pod 
\"301f2f68-2aae-4019-8b8e-a9473250de65\" (UID: \"301f2f68-2aae-4019-8b8e-a9473250de65\") " Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.482300 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/301f2f68-2aae-4019-8b8e-a9473250de65-operator-scripts\") pod \"301f2f68-2aae-4019-8b8e-a9473250de65\" (UID: \"301f2f68-2aae-4019-8b8e-a9473250de65\") " Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.483145 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/301f2f68-2aae-4019-8b8e-a9473250de65-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "301f2f68-2aae-4019-8b8e-a9473250de65" (UID: "301f2f68-2aae-4019-8b8e-a9473250de65"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.484581 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11baa6bc-e306-47e7-80c0-75a2236f35d0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11baa6bc-e306-47e7-80c0-75a2236f35d0" (UID: "11baa6bc-e306-47e7-80c0-75a2236f35d0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.484695 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204bbeca-833c-4e42-a955-03fde2c57e84-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "204bbeca-833c-4e42-a955-03fde2c57e84" (UID: "204bbeca-833c-4e42-a955-03fde2c57e84"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.488988 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/204bbeca-833c-4e42-a955-03fde2c57e84-kube-api-access-4pbmd" (OuterVolumeSpecName: "kube-api-access-4pbmd") pod "204bbeca-833c-4e42-a955-03fde2c57e84" (UID: "204bbeca-833c-4e42-a955-03fde2c57e84"). InnerVolumeSpecName "kube-api-access-4pbmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.489142 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11baa6bc-e306-47e7-80c0-75a2236f35d0-kube-api-access-8k7lt" (OuterVolumeSpecName: "kube-api-access-8k7lt") pod "11baa6bc-e306-47e7-80c0-75a2236f35d0" (UID: "11baa6bc-e306-47e7-80c0-75a2236f35d0"). InnerVolumeSpecName "kube-api-access-8k7lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.489238 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/301f2f68-2aae-4019-8b8e-a9473250de65-kube-api-access-vh6js" (OuterVolumeSpecName: "kube-api-access-vh6js") pod "301f2f68-2aae-4019-8b8e-a9473250de65" (UID: "301f2f68-2aae-4019-8b8e-a9473250de65"). InnerVolumeSpecName "kube-api-access-vh6js". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.585587 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11baa6bc-e306-47e7-80c0-75a2236f35d0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.585628 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k7lt\" (UniqueName: \"kubernetes.io/projected/11baa6bc-e306-47e7-80c0-75a2236f35d0-kube-api-access-8k7lt\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.585641 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh6js\" (UniqueName: \"kubernetes.io/projected/301f2f68-2aae-4019-8b8e-a9473250de65-kube-api-access-vh6js\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.585649 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/301f2f68-2aae-4019-8b8e-a9473250de65-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.585658 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pbmd\" (UniqueName: \"kubernetes.io/projected/204bbeca-833c-4e42-a955-03fde2c57e84-kube-api-access-4pbmd\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.585667 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/204bbeca-833c-4e42-a955-03fde2c57e84-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.792890 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-nwptx" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.800557 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-d4nc9" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.802850 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nwptx" event={"ID":"301f2f68-2aae-4019-8b8e-a9473250de65","Type":"ContainerDied","Data":"0059c36b5a06279d7e27b7c91927fe622bb231d86e969f3321cf19b33c9c516c"} Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.802887 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0059c36b5a06279d7e27b7c91927fe622bb231d86e969f3321cf19b33c9c516c" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.802898 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-d4nc9" event={"ID":"11baa6bc-e306-47e7-80c0-75a2236f35d0","Type":"ContainerDied","Data":"e2e62213f2e5e20ce0909570d314e89c24b012d345d0be646d0b21b17de73d9f"} Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.802909 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2e62213f2e5e20ce0909570d314e89c24b012d345d0be646d0b21b17de73d9f" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.808030 4898 generic.go:334] "Generic (PLEG): container finished" podID="821d27f4-deb1-4474-bee8-76c9caf611b1" containerID="b276dd4c401d4d881332ff7574fd26b6e60cd7531037eb63ff2213a045643cb6" exitCode=0 Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.808100 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a629-account-create-update-fsml9" event={"ID":"821d27f4-deb1-4474-bee8-76c9caf611b1","Type":"ContainerDied","Data":"b276dd4c401d4d881332ff7574fd26b6e60cd7531037eb63ff2213a045643cb6"} Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.817682 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"2e3039c326c669bb38f5fa0674f17632a14a368f2bae25141a77bd9dca547e1c"} Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.817727 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"63e1928510a61dcd53ad03cf2763fcb0492634f1e86e5baaae63534a4f229ba6"} Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.820823 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-llbsz" event={"ID":"204bbeca-833c-4e42-a955-03fde2c57e84","Type":"ContainerDied","Data":"f8ddd391dd411a7204e22cb9d2e76b5437d23898e2e657f6b9b9d4fbf4316e16"} Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.820867 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8ddd391dd411a7204e22cb9d2e76b5437d23898e2e657f6b9b9d4fbf4316e16" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.820836 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-llbsz" Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.824348 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c" containerID="2086865a139fffb7d9fb0bd0965100c91f1332465d40efaaa0782cfb315e8bcc" exitCode=0 Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.824550 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fed5-account-create-update-6hwgx" event={"ID":"6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c","Type":"ContainerDied","Data":"2086865a139fffb7d9fb0bd0965100c91f1332465d40efaaa0782cfb315e8bcc"} Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.829327 4898 generic.go:334] "Generic (PLEG): container finished" podID="0b57dfab-5b26-4f64-a406-ea1701ef79d1" containerID="2aadd6b419d7953b73237ba0b801687d9b696c3257afe7c941accd0fc0fc9c4e" exitCode=0 Dec 11 13:25:42 crc kubenswrapper[4898]: I1211 13:25:42.829618 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-689f7" event={"ID":"0b57dfab-5b26-4f64-a406-ea1701ef79d1","Type":"ContainerDied","Data":"2aadd6b419d7953b73237ba0b801687d9b696c3257afe7c941accd0fc0fc9c4e"} Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.264592 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c170-account-create-update-l27sp" Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.407625 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e-operator-scripts\") pod \"9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e\" (UID: \"9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e\") " Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.407876 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4x7r\" (UniqueName: \"kubernetes.io/projected/9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e-kube-api-access-q4x7r\") pod \"9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e\" (UID: \"9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e\") " Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.408511 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e" (UID: "9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.417557 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e-kube-api-access-q4x7r" (OuterVolumeSpecName: "kube-api-access-q4x7r") pod "9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e" (UID: "9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e"). InnerVolumeSpecName "kube-api-access-q4x7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.435181 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-af81-account-create-update-rqlbc" Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.509617 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbzmj\" (UniqueName: \"kubernetes.io/projected/eaf27272-313d-40d9-b882-151aaaf3da23-kube-api-access-pbzmj\") pod \"eaf27272-313d-40d9-b882-151aaaf3da23\" (UID: \"eaf27272-313d-40d9-b882-151aaaf3da23\") " Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.509931 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf27272-313d-40d9-b882-151aaaf3da23-operator-scripts\") pod \"eaf27272-313d-40d9-b882-151aaaf3da23\" (UID: \"eaf27272-313d-40d9-b882-151aaaf3da23\") " Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.510599 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.510619 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4x7r\" (UniqueName: \"kubernetes.io/projected/9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e-kube-api-access-q4x7r\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.510757 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaf27272-313d-40d9-b882-151aaaf3da23-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eaf27272-313d-40d9-b882-151aaaf3da23" (UID: "eaf27272-313d-40d9-b882-151aaaf3da23"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.515968 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaf27272-313d-40d9-b882-151aaaf3da23-kube-api-access-pbzmj" (OuterVolumeSpecName: "kube-api-access-pbzmj") pod "eaf27272-313d-40d9-b882-151aaaf3da23" (UID: "eaf27272-313d-40d9-b882-151aaaf3da23"). InnerVolumeSpecName "kube-api-access-pbzmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.612038 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf27272-313d-40d9-b882-151aaaf3da23-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.612067 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbzmj\" (UniqueName: \"kubernetes.io/projected/eaf27272-313d-40d9-b882-151aaaf3da23-kube-api-access-pbzmj\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.842263 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-af81-account-create-update-rqlbc" event={"ID":"eaf27272-313d-40d9-b882-151aaaf3da23","Type":"ContainerDied","Data":"79bf62a3fccbc65aa62333c943535e24cc7998959f3df9e8b17fb419ef37897b"} Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.842659 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79bf62a3fccbc65aa62333c943535e24cc7998959f3df9e8b17fb419ef37897b" Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.842394 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-af81-account-create-update-rqlbc" Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.850503 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c170-account-create-update-l27sp" event={"ID":"9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e","Type":"ContainerDied","Data":"7a0d8ed2c646db01745993be7a1b8e346b3597264c91fbd52fe2b4ceff222cb5"} Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.850542 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a0d8ed2c646db01745993be7a1b8e346b3597264c91fbd52fe2b4ceff222cb5" Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.850619 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c170-account-create-update-l27sp" Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.856294 4898 generic.go:334] "Generic (PLEG): container finished" podID="ca6259a8-1ee8-47a6-b102-e7e22b93c2c1" containerID="8f1383c507a0d9e655255875ca4a2160451be78073d4b1e6294538ffaa30d522" exitCode=0 Dec 11 13:25:43 crc kubenswrapper[4898]: I1211 13:25:43.856438 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lkkj2" event={"ID":"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1","Type":"ContainerDied","Data":"8f1383c507a0d9e655255875ca4a2160451be78073d4b1e6294538ffaa30d522"} Dec 11 13:25:44 crc kubenswrapper[4898]: I1211 13:25:44.303756 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-a629-account-create-update-fsml9" Dec 11 13:25:44 crc kubenswrapper[4898]: I1211 13:25:44.431262 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hlj4\" (UniqueName: \"kubernetes.io/projected/821d27f4-deb1-4474-bee8-76c9caf611b1-kube-api-access-9hlj4\") pod \"821d27f4-deb1-4474-bee8-76c9caf611b1\" (UID: \"821d27f4-deb1-4474-bee8-76c9caf611b1\") " Dec 11 13:25:44 crc kubenswrapper[4898]: I1211 13:25:44.431524 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/821d27f4-deb1-4474-bee8-76c9caf611b1-operator-scripts\") pod \"821d27f4-deb1-4474-bee8-76c9caf611b1\" (UID: \"821d27f4-deb1-4474-bee8-76c9caf611b1\") " Dec 11 13:25:44 crc kubenswrapper[4898]: I1211 13:25:44.432924 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/821d27f4-deb1-4474-bee8-76c9caf611b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "821d27f4-deb1-4474-bee8-76c9caf611b1" (UID: "821d27f4-deb1-4474-bee8-76c9caf611b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:44 crc kubenswrapper[4898]: I1211 13:25:44.437529 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/821d27f4-deb1-4474-bee8-76c9caf611b1-kube-api-access-9hlj4" (OuterVolumeSpecName: "kube-api-access-9hlj4") pod "821d27f4-deb1-4474-bee8-76c9caf611b1" (UID: "821d27f4-deb1-4474-bee8-76c9caf611b1"). InnerVolumeSpecName "kube-api-access-9hlj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:44 crc kubenswrapper[4898]: I1211 13:25:44.534373 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hlj4\" (UniqueName: \"kubernetes.io/projected/821d27f4-deb1-4474-bee8-76c9caf611b1-kube-api-access-9hlj4\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:44 crc kubenswrapper[4898]: I1211 13:25:44.535300 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/821d27f4-deb1-4474-bee8-76c9caf611b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:44 crc kubenswrapper[4898]: I1211 13:25:44.880126 4898 generic.go:334] "Generic (PLEG): container finished" podID="81600875-da95-4cb0-b179-1804494d29d8" containerID="aa9f5174956acb94b0c096eb5585b4a0597f83b7ef735b5ecb41a49880fe69cd" exitCode=0 Dec 11 13:25:44 crc kubenswrapper[4898]: I1211 13:25:44.880520 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"81600875-da95-4cb0-b179-1804494d29d8","Type":"ContainerDied","Data":"aa9f5174956acb94b0c096eb5585b4a0597f83b7ef735b5ecb41a49880fe69cd"} Dec 11 13:25:44 crc kubenswrapper[4898]: I1211 13:25:44.885023 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a629-account-create-update-fsml9" event={"ID":"821d27f4-deb1-4474-bee8-76c9caf611b1","Type":"ContainerDied","Data":"4ee548e15470f6ac43ac1d07fc6c13e595d1fabbb3e63cb06e565da00ee66747"} Dec 11 13:25:44 crc kubenswrapper[4898]: I1211 13:25:44.885066 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ee548e15470f6ac43ac1d07fc6c13e595d1fabbb3e63cb06e565da00ee66747" Dec 11 13:25:44 crc kubenswrapper[4898]: I1211 13:25:44.885042 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-a629-account-create-update-fsml9" Dec 11 13:25:44 crc kubenswrapper[4898]: I1211 13:25:44.888957 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"6a7e9aca350ca5aebf32d7541c539d29eef58ddfcf85a2a3d696f916044c3b55"} Dec 11 13:25:44 crc kubenswrapper[4898]: I1211 13:25:44.889006 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"9c3fdd744d04af516cd9bf3114d8bf136b148a2097db74c981832301e2666aa1"} Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.229192 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-689f7" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.279930 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.309298 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fed5-account-create-update-6hwgx" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.338273 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbm44\" (UniqueName: \"kubernetes.io/projected/0b57dfab-5b26-4f64-a406-ea1701ef79d1-kube-api-access-rbm44\") pod \"0b57dfab-5b26-4f64-a406-ea1701ef79d1\" (UID: \"0b57dfab-5b26-4f64-a406-ea1701ef79d1\") " Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.338694 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-combined-ca-bundle\") pod \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\" (UID: \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.338865 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-db-sync-config-data\") pod \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\" (UID: \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.339025 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx7jx\" (UniqueName: \"kubernetes.io/projected/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-kube-api-access-kx7jx\") pod \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\" (UID: \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.339241 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b57dfab-5b26-4f64-a406-ea1701ef79d1-operator-scripts\") pod \"0b57dfab-5b26-4f64-a406-ea1701ef79d1\" (UID: \"0b57dfab-5b26-4f64-a406-ea1701ef79d1\") " Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.339534 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w22kf\" (UniqueName: \"kubernetes.io/projected/6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c-kube-api-access-w22kf\") pod \"6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c\" (UID: \"6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c\") " Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.339755 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-config-data\") pod \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\" (UID: \"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1\") " Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.339962 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c-operator-scripts\") pod \"6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c\" (UID: \"6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c\") " Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.340070 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b57dfab-5b26-4f64-a406-ea1701ef79d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b57dfab-5b26-4f64-a406-ea1701ef79d1" (UID: "0b57dfab-5b26-4f64-a406-ea1701ef79d1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.341839 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c" (UID: "6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.343682 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b57dfab-5b26-4f64-a406-ea1701ef79d1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.343957 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.347532 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ca6259a8-1ee8-47a6-b102-e7e22b93c2c1" (UID: "ca6259a8-1ee8-47a6-b102-e7e22b93c2c1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.348310 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b57dfab-5b26-4f64-a406-ea1701ef79d1-kube-api-access-rbm44" (OuterVolumeSpecName: "kube-api-access-rbm44") pod "0b57dfab-5b26-4f64-a406-ea1701ef79d1" (UID: "0b57dfab-5b26-4f64-a406-ea1701ef79d1"). InnerVolumeSpecName "kube-api-access-rbm44". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.348890 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c-kube-api-access-w22kf" (OuterVolumeSpecName: "kube-api-access-w22kf") pod "6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c" (UID: "6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c"). InnerVolumeSpecName "kube-api-access-w22kf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.356538 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-kube-api-access-kx7jx" (OuterVolumeSpecName: "kube-api-access-kx7jx") pod "ca6259a8-1ee8-47a6-b102-e7e22b93c2c1" (UID: "ca6259a8-1ee8-47a6-b102-e7e22b93c2c1"). InnerVolumeSpecName "kube-api-access-kx7jx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.400003 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca6259a8-1ee8-47a6-b102-e7e22b93c2c1" (UID: "ca6259a8-1ee8-47a6-b102-e7e22b93c2c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.430167 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-config-data" (OuterVolumeSpecName: "config-data") pod "ca6259a8-1ee8-47a6-b102-e7e22b93c2c1" (UID: "ca6259a8-1ee8-47a6-b102-e7e22b93c2c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.445299 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbm44\" (UniqueName: \"kubernetes.io/projected/0b57dfab-5b26-4f64-a406-ea1701ef79d1-kube-api-access-rbm44\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.445347 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.445368 4898 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.445385 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx7jx\" (UniqueName: \"kubernetes.io/projected/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-kube-api-access-kx7jx\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.445403 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w22kf\" (UniqueName: \"kubernetes.io/projected/6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c-kube-api-access-w22kf\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.445420 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.938946 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"81600875-da95-4cb0-b179-1804494d29d8","Type":"ContainerStarted","Data":"553ac7b12778d3b42643b7b7060b258af007ee5070205ae9e8f863c11433a252"} Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.942758 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"ab136be38b4d221093bafb2751790eb1de3882a936a46f8074559b421db9023c"} Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.942797 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"1dda32aa828438bdd7a4a17aaf983b22456828d8340ef3f7b8b566ff50772f9b"} Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.945129 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fed5-account-create-update-6hwgx" event={"ID":"6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c","Type":"ContainerDied","Data":"eb757541e2f94236b84bfb076297c318fe8d75d0920788a56a26335328da3f20"} Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.945154 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb757541e2f94236b84bfb076297c318fe8d75d0920788a56a26335328da3f20" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.945170 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fed5-account-create-update-6hwgx" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.948937 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lkkj2" event={"ID":"ca6259a8-1ee8-47a6-b102-e7e22b93c2c1","Type":"ContainerDied","Data":"c332a2bc37bb191ef4009f22e8e94334d007bd79dca5b2a319d184d5717fbc7b"} Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.948978 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c332a2bc37bb191ef4009f22e8e94334d007bd79dca5b2a319d184d5717fbc7b" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.949051 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lkkj2" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.951950 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-689f7" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.952167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-689f7" event={"ID":"0b57dfab-5b26-4f64-a406-ea1701ef79d1","Type":"ContainerDied","Data":"ef5275c719b921ba94b9e3d1b1a3c462c45165b2f6d76b9e84e106d3f3c5d23d"} Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.952194 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef5275c719b921ba94b9e3d1b1a3c462c45165b2f6d76b9e84e106d3f3c5d23d" Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.966826 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4pkrx" event={"ID":"d314c3d5-6a51-4713-bae9-d25641533de2","Type":"ContainerStarted","Data":"1705ff38453876f18c54dd48ada64a13d3af0c4717292a3feaf5b6e58d32fada"} Dec 11 13:25:48 crc kubenswrapper[4898]: I1211 13:25:48.998441 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-4pkrx" 
podStartSLOduration=2.386565074 podStartE2EDuration="9.998422344s" podCreationTimestamp="2025-12-11 13:25:39 +0000 UTC" firstStartedPulling="2025-12-11 13:25:40.466413897 +0000 UTC m=+1298.038740334" lastFinishedPulling="2025-12-11 13:25:48.078271167 +0000 UTC m=+1305.650597604" observedRunningTime="2025-12-11 13:25:48.991412379 +0000 UTC m=+1306.563738816" watchObservedRunningTime="2025-12-11 13:25:48.998422344 +0000 UTC m=+1306.570748781" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.713499 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-w4rz8"] Dec 11 13:25:49 crc kubenswrapper[4898]: E1211 13:25:49.714172 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204bbeca-833c-4e42-a955-03fde2c57e84" containerName="mariadb-database-create" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.714187 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="204bbeca-833c-4e42-a955-03fde2c57e84" containerName="mariadb-database-create" Dec 11 13:25:49 crc kubenswrapper[4898]: E1211 13:25:49.714211 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e" containerName="mariadb-account-create-update" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.714217 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e" containerName="mariadb-account-create-update" Dec 11 13:25:49 crc kubenswrapper[4898]: E1211 13:25:49.714230 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="301f2f68-2aae-4019-8b8e-a9473250de65" containerName="mariadb-database-create" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.714237 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="301f2f68-2aae-4019-8b8e-a9473250de65" containerName="mariadb-database-create" Dec 11 13:25:49 crc kubenswrapper[4898]: E1211 13:25:49.714248 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eaf27272-313d-40d9-b882-151aaaf3da23" containerName="mariadb-account-create-update" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.714254 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf27272-313d-40d9-b882-151aaaf3da23" containerName="mariadb-account-create-update" Dec 11 13:25:49 crc kubenswrapper[4898]: E1211 13:25:49.714262 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c" containerName="mariadb-account-create-update" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.714268 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c" containerName="mariadb-account-create-update" Dec 11 13:25:49 crc kubenswrapper[4898]: E1211 13:25:49.714291 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11baa6bc-e306-47e7-80c0-75a2236f35d0" containerName="mariadb-database-create" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.714297 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="11baa6bc-e306-47e7-80c0-75a2236f35d0" containerName="mariadb-database-create" Dec 11 13:25:49 crc kubenswrapper[4898]: E1211 13:25:49.714307 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6259a8-1ee8-47a6-b102-e7e22b93c2c1" containerName="glance-db-sync" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.714313 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6259a8-1ee8-47a6-b102-e7e22b93c2c1" containerName="glance-db-sync" Dec 11 13:25:49 crc kubenswrapper[4898]: E1211 13:25:49.714325 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b57dfab-5b26-4f64-a406-ea1701ef79d1" containerName="mariadb-database-create" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.714333 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b57dfab-5b26-4f64-a406-ea1701ef79d1" containerName="mariadb-database-create" Dec 11 13:25:49 crc kubenswrapper[4898]: E1211 
13:25:49.714344 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821d27f4-deb1-4474-bee8-76c9caf611b1" containerName="mariadb-account-create-update" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.714350 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="821d27f4-deb1-4474-bee8-76c9caf611b1" containerName="mariadb-account-create-update" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.722688 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf27272-313d-40d9-b882-151aaaf3da23" containerName="mariadb-account-create-update" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.722739 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="301f2f68-2aae-4019-8b8e-a9473250de65" containerName="mariadb-database-create" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.722751 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b57dfab-5b26-4f64-a406-ea1701ef79d1" containerName="mariadb-database-create" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.722764 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="204bbeca-833c-4e42-a955-03fde2c57e84" containerName="mariadb-database-create" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.722773 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="11baa6bc-e306-47e7-80c0-75a2236f35d0" containerName="mariadb-database-create" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.722784 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e" containerName="mariadb-account-create-update" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.722793 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca6259a8-1ee8-47a6-b102-e7e22b93c2c1" containerName="glance-db-sync" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.722802 4898 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="821d27f4-deb1-4474-bee8-76c9caf611b1" containerName="mariadb-account-create-update" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.722810 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c" containerName="mariadb-account-create-update" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.724191 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.742005 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-w4rz8"] Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.782828 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-config\") pod \"dnsmasq-dns-74dc88fc-w4rz8\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.782892 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-w4rz8\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.783044 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-dns-svc\") pod \"dnsmasq-dns-74dc88fc-w4rz8\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.783086 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-w4rz8\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.783189 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-274mg\" (UniqueName: \"kubernetes.io/projected/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-kube-api-access-274mg\") pod \"dnsmasq-dns-74dc88fc-w4rz8\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.886885 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-274mg\" (UniqueName: \"kubernetes.io/projected/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-kube-api-access-274mg\") pod \"dnsmasq-dns-74dc88fc-w4rz8\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.887029 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-config\") pod \"dnsmasq-dns-74dc88fc-w4rz8\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.887062 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-w4rz8\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.887175 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-dns-svc\") pod \"dnsmasq-dns-74dc88fc-w4rz8\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.887213 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-w4rz8\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.888283 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-w4rz8\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.888431 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-w4rz8\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.890521 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-config\") pod \"dnsmasq-dns-74dc88fc-w4rz8\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.890829 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-dns-svc\") pod \"dnsmasq-dns-74dc88fc-w4rz8\" (UID: 
\"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:49 crc kubenswrapper[4898]: I1211 13:25:49.921518 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-274mg\" (UniqueName: \"kubernetes.io/projected/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-kube-api-access-274mg\") pod \"dnsmasq-dns-74dc88fc-w4rz8\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:50 crc kubenswrapper[4898]: I1211 13:25:50.101295 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:50 crc kubenswrapper[4898]: I1211 13:25:50.701333 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-w4rz8"] Dec 11 13:25:50 crc kubenswrapper[4898]: I1211 13:25:50.993909 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" event={"ID":"f8fd80d4-db6d-40ce-bbaf-13565266e5cc","Type":"ContainerStarted","Data":"ce268f2a7f6b938bc17435e640e4c1d276a996d07747708cf3e4099c93464507"} Dec 11 13:25:52 crc kubenswrapper[4898]: I1211 13:25:52.020723 4898 generic.go:334] "Generic (PLEG): container finished" podID="f8fd80d4-db6d-40ce-bbaf-13565266e5cc" containerID="3d1a3b5c868a63a4a586e8e454fa7fba969e9fa25900977fbad4682786865ae5" exitCode=0 Dec 11 13:25:52 crc kubenswrapper[4898]: I1211 13:25:52.021124 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" event={"ID":"f8fd80d4-db6d-40ce-bbaf-13565266e5cc","Type":"ContainerDied","Data":"3d1a3b5c868a63a4a586e8e454fa7fba969e9fa25900977fbad4682786865ae5"} Dec 11 13:25:52 crc kubenswrapper[4898]: I1211 13:25:52.032873 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"81600875-da95-4cb0-b179-1804494d29d8","Type":"ContainerStarted","Data":"c5d6c55e90633c5cbdafa76d5e4d6b16831528eb12df14aff34c5f2bf9a4830e"} Dec 11 13:25:52 crc kubenswrapper[4898]: I1211 13:25:52.032923 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"81600875-da95-4cb0-b179-1804494d29d8","Type":"ContainerStarted","Data":"b9b15ec6b320a4d58dc4a475b402d3bce08b5bd3af617f12c907d5009caa71d9"} Dec 11 13:25:52 crc kubenswrapper[4898]: I1211 13:25:52.044554 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"e556e68a0187c82fd9d587f769eef2ed1ca5dcc32ad0b1f327502c68859641ac"} Dec 11 13:25:52 crc kubenswrapper[4898]: I1211 13:25:52.044600 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"261a00eed6e47ea5bb8d65ec9ad05bdb850918b3c50cde913b689893def18ac4"} Dec 11 13:25:52 crc kubenswrapper[4898]: I1211 13:25:52.044612 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"7fe2c8754819bb2f032aceada6515c6e083ff53baf39b6682d1ade1afc180633"} Dec 11 13:25:52 crc kubenswrapper[4898]: I1211 13:25:52.077614 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=20.077598015 podStartE2EDuration="20.077598015s" podCreationTimestamp="2025-12-11 13:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:25:52.069853361 +0000 UTC m=+1309.642179808" watchObservedRunningTime="2025-12-11 13:25:52.077598015 +0000 UTC m=+1309.649924452" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 
13:25:53.035636 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.070063 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"2a8efe97bb028e9055fabc495befb32c68ed9470d579e835840f4ca50e39df2e"} Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.071679 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"bbc756c1ac48626d88231f13ea594a98d51052f1fa61c78eb827c5d91e2e83c3"} Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.071807 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"2950dc299b7d43980a87965322c7ea76e658512e74afb8d352248aad8dea7b9c"} Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.071897 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"717e7951-d95b-497f-b2b7-3ec4ef755642","Type":"ContainerStarted","Data":"b9c60d65e40221db0d08742bd00d81cc9b72af941beb7448fcb49c3b92555b51"} Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.075260 4898 generic.go:334] "Generic (PLEG): container finished" podID="d314c3d5-6a51-4713-bae9-d25641533de2" containerID="1705ff38453876f18c54dd48ada64a13d3af0c4717292a3feaf5b6e58d32fada" exitCode=0 Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.075332 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4pkrx" event={"ID":"d314c3d5-6a51-4713-bae9-d25641533de2","Type":"ContainerDied","Data":"1705ff38453876f18c54dd48ada64a13d3af0c4717292a3feaf5b6e58d32fada"} Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.079866 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" event={"ID":"f8fd80d4-db6d-40ce-bbaf-13565266e5cc","Type":"ContainerStarted","Data":"a9cbb15434f9bb88ff30b78dbc766a42c4ca59a66f3a25e0b2a428474a01ffb9"} Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.079990 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.115955 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.747289489 podStartE2EDuration="50.115932547s" podCreationTimestamp="2025-12-11 13:25:03 +0000 UTC" firstStartedPulling="2025-12-11 13:25:37.548491136 +0000 UTC m=+1295.120817583" lastFinishedPulling="2025-12-11 13:25:50.917134204 +0000 UTC m=+1308.489460641" observedRunningTime="2025-12-11 13:25:53.101955979 +0000 UTC m=+1310.674282416" watchObservedRunningTime="2025-12-11 13:25:53.115932547 +0000 UTC m=+1310.688258984" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.134673 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" podStartSLOduration=4.134648751 podStartE2EDuration="4.134648751s" podCreationTimestamp="2025-12-11 13:25:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:25:53.125071788 +0000 UTC m=+1310.697398225" watchObservedRunningTime="2025-12-11 13:25:53.134648751 +0000 UTC m=+1310.706975188" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.378473 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-w4rz8"] Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.413103 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-9flrp"] Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.415646 4898 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.420593 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.428486 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-9flrp"] Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.574066 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.574171 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.574226 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.574346 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-config\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " 
pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.574376 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.574548 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h4k9\" (UniqueName: \"kubernetes.io/projected/5c05e490-8014-425f-a1f1-7f4374e1d7da-kube-api-access-2h4k9\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.676493 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.676609 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-config\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.676630 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " 
pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.676693 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h4k9\" (UniqueName: \"kubernetes.io/projected/5c05e490-8014-425f-a1f1-7f4374e1d7da-kube-api-access-2h4k9\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.676719 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.676765 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.677539 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.677635 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 
crc kubenswrapper[4898]: I1211 13:25:53.677700 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.677697 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-config\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.677857 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.701466 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h4k9\" (UniqueName: \"kubernetes.io/projected/5c05e490-8014-425f-a1f1-7f4374e1d7da-kube-api-access-2h4k9\") pod \"dnsmasq-dns-5f59b8f679-9flrp\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:53 crc kubenswrapper[4898]: I1211 13:25:53.746600 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:54 crc kubenswrapper[4898]: I1211 13:25:54.209624 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-9flrp"] Dec 11 13:25:54 crc kubenswrapper[4898]: W1211 13:25:54.211864 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c05e490_8014_425f_a1f1_7f4374e1d7da.slice/crio-808880a1addb80674fe1bc5a2ad79ef00371ba9cdcd9c877f2c3782f187da30f WatchSource:0}: Error finding container 808880a1addb80674fe1bc5a2ad79ef00371ba9cdcd9c877f2c3782f187da30f: Status 404 returned error can't find the container with id 808880a1addb80674fe1bc5a2ad79ef00371ba9cdcd9c877f2c3782f187da30f Dec 11 13:25:54 crc kubenswrapper[4898]: I1211 13:25:54.397772 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4pkrx" Dec 11 13:25:54 crc kubenswrapper[4898]: I1211 13:25:54.494502 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d314c3d5-6a51-4713-bae9-d25641533de2-config-data\") pod \"d314c3d5-6a51-4713-bae9-d25641533de2\" (UID: \"d314c3d5-6a51-4713-bae9-d25641533de2\") " Dec 11 13:25:54 crc kubenswrapper[4898]: I1211 13:25:54.494625 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d314c3d5-6a51-4713-bae9-d25641533de2-combined-ca-bundle\") pod \"d314c3d5-6a51-4713-bae9-d25641533de2\" (UID: \"d314c3d5-6a51-4713-bae9-d25641533de2\") " Dec 11 13:25:54 crc kubenswrapper[4898]: I1211 13:25:54.494704 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csk6q\" (UniqueName: \"kubernetes.io/projected/d314c3d5-6a51-4713-bae9-d25641533de2-kube-api-access-csk6q\") pod \"d314c3d5-6a51-4713-bae9-d25641533de2\" (UID: 
\"d314c3d5-6a51-4713-bae9-d25641533de2\") " Dec 11 13:25:54 crc kubenswrapper[4898]: I1211 13:25:54.499242 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d314c3d5-6a51-4713-bae9-d25641533de2-kube-api-access-csk6q" (OuterVolumeSpecName: "kube-api-access-csk6q") pod "d314c3d5-6a51-4713-bae9-d25641533de2" (UID: "d314c3d5-6a51-4713-bae9-d25641533de2"). InnerVolumeSpecName "kube-api-access-csk6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:54 crc kubenswrapper[4898]: I1211 13:25:54.537694 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d314c3d5-6a51-4713-bae9-d25641533de2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d314c3d5-6a51-4713-bae9-d25641533de2" (UID: "d314c3d5-6a51-4713-bae9-d25641533de2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:25:54 crc kubenswrapper[4898]: I1211 13:25:54.559069 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d314c3d5-6a51-4713-bae9-d25641533de2-config-data" (OuterVolumeSpecName: "config-data") pod "d314c3d5-6a51-4713-bae9-d25641533de2" (UID: "d314c3d5-6a51-4713-bae9-d25641533de2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:25:54 crc kubenswrapper[4898]: I1211 13:25:54.596669 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csk6q\" (UniqueName: \"kubernetes.io/projected/d314c3d5-6a51-4713-bae9-d25641533de2-kube-api-access-csk6q\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:54 crc kubenswrapper[4898]: I1211 13:25:54.596963 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d314c3d5-6a51-4713-bae9-d25641533de2-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:54 crc kubenswrapper[4898]: I1211 13:25:54.596981 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d314c3d5-6a51-4713-bae9-d25641533de2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.099881 4898 generic.go:334] "Generic (PLEG): container finished" podID="5c05e490-8014-425f-a1f1-7f4374e1d7da" containerID="41500bcf18203778da4094b610eb74d8a1fd2e65f82dfa4b72b7ec1602c4279c" exitCode=0 Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.100008 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" event={"ID":"5c05e490-8014-425f-a1f1-7f4374e1d7da","Type":"ContainerDied","Data":"41500bcf18203778da4094b610eb74d8a1fd2e65f82dfa4b72b7ec1602c4279c"} Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.100088 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" event={"ID":"5c05e490-8014-425f-a1f1-7f4374e1d7da","Type":"ContainerStarted","Data":"808880a1addb80674fe1bc5a2ad79ef00371ba9cdcd9c877f2c3782f187da30f"} Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.104414 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4pkrx" 
event={"ID":"d314c3d5-6a51-4713-bae9-d25641533de2","Type":"ContainerDied","Data":"ca0a3293be65012d97853bd14bb6a8d637897f11e64370b9af669eec162dcd09"} Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.104535 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca0a3293be65012d97853bd14bb6a8d637897f11e64370b9af669eec162dcd09" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.104581 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4pkrx" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.104700 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" podUID="f8fd80d4-db6d-40ce-bbaf-13565266e5cc" containerName="dnsmasq-dns" containerID="cri-o://a9cbb15434f9bb88ff30b78dbc766a42c4ca59a66f3a25e0b2a428474a01ffb9" gracePeriod=10 Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.403090 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-zctw4"] Dec 11 13:25:55 crc kubenswrapper[4898]: E1211 13:25:55.403975 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d314c3d5-6a51-4713-bae9-d25641533de2" containerName="keystone-db-sync" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.403994 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d314c3d5-6a51-4713-bae9-d25641533de2" containerName="keystone-db-sync" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.404250 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d314c3d5-6a51-4713-bae9-d25641533de2" containerName="keystone-db-sync" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.405239 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.411274 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.419491 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.419667 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.419783 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qkphd" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.419919 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.470078 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zctw4"] Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.534057 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-fernet-keys\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.534141 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-credential-keys\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.534243 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-combined-ca-bundle\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.534275 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-config-data\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.534322 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdgpd\" (UniqueName: \"kubernetes.io/projected/730a9af6-05eb-4ab9-a19e-c2d359813459-kube-api-access-cdgpd\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.534368 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-scripts\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.537141 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-9flrp"] Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.601667 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-2mhm5"] Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.604032 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.624269 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-2mhm5"] Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.636495 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-fernet-keys\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.636568 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-credential-keys\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.636626 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-combined-ca-bundle\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.636645 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-config-data\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.636687 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdgpd\" (UniqueName: 
\"kubernetes.io/projected/730a9af6-05eb-4ab9-a19e-c2d359813459-kube-api-access-cdgpd\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.637484 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-scripts\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.648439 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-mvzhc"] Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.649815 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-mvzhc" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.652528 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-config-data\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.653879 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-fernet-keys\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.656929 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-credential-keys\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " 
pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.657230 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-scripts\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.660524 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-nrsz5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.660718 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.677185 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-combined-ca-bundle\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.681832 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-mvzhc"] Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.696555 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-f57gt"] Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.697856 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-f57gt" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.703795 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdgpd\" (UniqueName: \"kubernetes.io/projected/730a9af6-05eb-4ab9-a19e-c2d359813459-kube-api-access-cdgpd\") pod \"keystone-bootstrap-zctw4\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.713521 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-f57gt"] Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.733476 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.733837 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-dn6dl" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.734007 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.734088 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-6v7s7"] Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.735320 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.736548 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.739068 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.746894 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23459d62-b558-4f82-a875-311d5fa486e5-combined-ca-bundle\") pod \"heat-db-sync-mvzhc\" (UID: \"23459d62-b558-4f82-a875-311d5fa486e5\") " pod="openstack/heat-db-sync-mvzhc" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.747094 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ndbb\" (UniqueName: \"kubernetes.io/projected/83e282d3-244c-4ec5-afc0-8ac624fe0879-kube-api-access-2ndbb\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.747242 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23459d62-b558-4f82-a875-311d5fa486e5-config-data\") pod \"heat-db-sync-mvzhc\" (UID: \"23459d62-b558-4f82-a875-311d5fa486e5\") " pod="openstack/heat-db-sync-mvzhc" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.747482 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: 
\"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.748172 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.748355 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwt28\" (UniqueName: \"kubernetes.io/projected/23459d62-b558-4f82-a875-311d5fa486e5-kube-api-access-wwt28\") pod \"heat-db-sync-mvzhc\" (UID: \"23459d62-b558-4f82-a875-311d5fa486e5\") " pod="openstack/heat-db-sync-mvzhc" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.748448 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.748592 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-config\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.762131 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-d68j7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.762436 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cinder-scripts" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.762635 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.806518 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6v7s7"] Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851104 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851154 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-combined-ca-bundle\") pod \"neutron-db-sync-f57gt\" (UID: \"4ff1a3a4-ba9d-4155-b289-46de3809f5f4\") " pod="openstack/neutron-db-sync-f57gt" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851176 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7ssj\" (UniqueName: \"kubernetes.io/projected/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-kube-api-access-c7ssj\") pod \"neutron-db-sync-f57gt\" (UID: \"4ff1a3a4-ba9d-4155-b289-46de3809f5f4\") " pod="openstack/neutron-db-sync-f57gt" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851206 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 
13:25:55.851252 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-config-data\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851278 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwt28\" (UniqueName: \"kubernetes.io/projected/23459d62-b558-4f82-a875-311d5fa486e5-kube-api-access-wwt28\") pod \"heat-db-sync-mvzhc\" (UID: \"23459d62-b558-4f82-a875-311d5fa486e5\") " pod="openstack/heat-db-sync-mvzhc" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851299 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851320 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-config\") pod \"neutron-db-sync-f57gt\" (UID: \"4ff1a3a4-ba9d-4155-b289-46de3809f5f4\") " pod="openstack/neutron-db-sync-f57gt" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851344 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-config\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851380 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-db-sync-config-data\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851446 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzlfq\" (UniqueName: \"kubernetes.io/projected/76d89e82-8f2e-4198-8736-28293404a0bd-kube-api-access-pzlfq\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851493 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851516 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23459d62-b558-4f82-a875-311d5fa486e5-combined-ca-bundle\") pod \"heat-db-sync-mvzhc\" (UID: \"23459d62-b558-4f82-a875-311d5fa486e5\") " pod="openstack/heat-db-sync-mvzhc" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851533 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ndbb\" (UniqueName: \"kubernetes.io/projected/83e282d3-244c-4ec5-afc0-8ac624fe0879-kube-api-access-2ndbb\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851558 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/23459d62-b558-4f82-a875-311d5fa486e5-config-data\") pod \"heat-db-sync-mvzhc\" (UID: \"23459d62-b558-4f82-a875-311d5fa486e5\") " pod="openstack/heat-db-sync-mvzhc" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851579 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/76d89e82-8f2e-4198-8736-28293404a0bd-etc-machine-id\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851594 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-combined-ca-bundle\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.851623 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-scripts\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.852424 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.852874 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-ovsdbserver-nb\") pod 
\"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.853040 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-config\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.853388 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.853803 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.875367 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23459d62-b558-4f82-a875-311d5fa486e5-config-data\") pod \"heat-db-sync-mvzhc\" (UID: \"23459d62-b558-4f82-a875-311d5fa486e5\") " pod="openstack/heat-db-sync-mvzhc" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.879697 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-gtqwj"] Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.886370 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gtqwj" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.889290 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23459d62-b558-4f82-a875-311d5fa486e5-combined-ca-bundle\") pod \"heat-db-sync-mvzhc\" (UID: \"23459d62-b558-4f82-a875-311d5fa486e5\") " pod="openstack/heat-db-sync-mvzhc" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.908253 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ndbb\" (UniqueName: \"kubernetes.io/projected/83e282d3-244c-4ec5-afc0-8ac624fe0879-kube-api-access-2ndbb\") pod \"dnsmasq-dns-bbf5cc879-2mhm5\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.917953 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-cns9k" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.925807 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.933573 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwt28\" (UniqueName: \"kubernetes.io/projected/23459d62-b558-4f82-a875-311d5fa486e5-kube-api-access-wwt28\") pod \"heat-db-sync-mvzhc\" (UID: \"23459d62-b558-4f82-a875-311d5fa486e5\") " pod="openstack/heat-db-sync-mvzhc" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.943575 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.958383 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-scripts\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.958434 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-combined-ca-bundle\") pod \"neutron-db-sync-f57gt\" (UID: \"4ff1a3a4-ba9d-4155-b289-46de3809f5f4\") " pod="openstack/neutron-db-sync-f57gt" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.958496 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7ssj\" (UniqueName: \"kubernetes.io/projected/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-kube-api-access-c7ssj\") pod \"neutron-db-sync-f57gt\" (UID: \"4ff1a3a4-ba9d-4155-b289-46de3809f5f4\") " pod="openstack/neutron-db-sync-f57gt" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.958544 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-config-data\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.958589 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-config\") pod \"neutron-db-sync-f57gt\" (UID: \"4ff1a3a4-ba9d-4155-b289-46de3809f5f4\") " pod="openstack/neutron-db-sync-f57gt" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.958624 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-db-sync-config-data\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.958672 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzlfq\" (UniqueName: \"kubernetes.io/projected/76d89e82-8f2e-4198-8736-28293404a0bd-kube-api-access-pzlfq\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.958733 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/76d89e82-8f2e-4198-8736-28293404a0bd-etc-machine-id\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.958753 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-combined-ca-bundle\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.965559 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-config-data\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.974945 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/76d89e82-8f2e-4198-8736-28293404a0bd-etc-machine-id\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.977139 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-mvzhc" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.987978 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-db-sync-config-data\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:55 crc kubenswrapper[4898]: I1211 13:25:55.989512 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gtqwj"] Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.001280 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-combined-ca-bundle\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.001974 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-combined-ca-bundle\") pod \"neutron-db-sync-f57gt\" (UID: \"4ff1a3a4-ba9d-4155-b289-46de3809f5f4\") " pod="openstack/neutron-db-sync-f57gt" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.007741 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-mhjjq"] Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.008337 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-scripts\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.011608 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.013481 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzlfq\" (UniqueName: \"kubernetes.io/projected/76d89e82-8f2e-4198-8736-28293404a0bd-kube-api-access-pzlfq\") pod \"cinder-db-sync-6v7s7\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.015395 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-config\") pod \"neutron-db-sync-f57gt\" (UID: \"4ff1a3a4-ba9d-4155-b289-46de3809f5f4\") " pod="openstack/neutron-db-sync-f57gt" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.042042 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.042332 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hfvt8" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.042340 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.045380 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7ssj\" (UniqueName: \"kubernetes.io/projected/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-kube-api-access-c7ssj\") pod \"neutron-db-sync-f57gt\" (UID: \"4ff1a3a4-ba9d-4155-b289-46de3809f5f4\") " pod="openstack/neutron-db-sync-f57gt" Dec 11 13:25:56 
crc kubenswrapper[4898]: I1211 13:25:56.051588 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-f57gt" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.063969 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5924443a-434a-4efc-b04d-fdc73d3e2fe6-db-sync-config-data\") pod \"barbican-db-sync-gtqwj\" (UID: \"5924443a-434a-4efc-b04d-fdc73d3e2fe6\") " pod="openstack/barbican-db-sync-gtqwj" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.064093 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5924443a-434a-4efc-b04d-fdc73d3e2fe6-combined-ca-bundle\") pod \"barbican-db-sync-gtqwj\" (UID: \"5924443a-434a-4efc-b04d-fdc73d3e2fe6\") " pod="openstack/barbican-db-sync-gtqwj" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.064124 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4jdf\" (UniqueName: \"kubernetes.io/projected/5924443a-434a-4efc-b04d-fdc73d3e2fe6-kube-api-access-r4jdf\") pod \"barbican-db-sync-gtqwj\" (UID: \"5924443a-434a-4efc-b04d-fdc73d3e2fe6\") " pod="openstack/barbican-db-sync-gtqwj" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.107589 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.167786 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5924443a-434a-4efc-b04d-fdc73d3e2fe6-db-sync-config-data\") pod \"barbican-db-sync-gtqwj\" (UID: \"5924443a-434a-4efc-b04d-fdc73d3e2fe6\") " pod="openstack/barbican-db-sync-gtqwj" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.168043 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b93b66d1-a195-42b8-912d-5029d9f0e6b3-logs\") pod \"placement-db-sync-mhjjq\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.168094 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-scripts\") pod \"placement-db-sync-mhjjq\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.168112 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-combined-ca-bundle\") pod \"placement-db-sync-mhjjq\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.168158 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5924443a-434a-4efc-b04d-fdc73d3e2fe6-combined-ca-bundle\") pod \"barbican-db-sync-gtqwj\" (UID: \"5924443a-434a-4efc-b04d-fdc73d3e2fe6\") " pod="openstack/barbican-db-sync-gtqwj" Dec 11 
13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.168188 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw5pd\" (UniqueName: \"kubernetes.io/projected/b93b66d1-a195-42b8-912d-5029d9f0e6b3-kube-api-access-qw5pd\") pod \"placement-db-sync-mhjjq\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.168206 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4jdf\" (UniqueName: \"kubernetes.io/projected/5924443a-434a-4efc-b04d-fdc73d3e2fe6-kube-api-access-r4jdf\") pod \"barbican-db-sync-gtqwj\" (UID: \"5924443a-434a-4efc-b04d-fdc73d3e2fe6\") " pod="openstack/barbican-db-sync-gtqwj" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.168241 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-config-data\") pod \"placement-db-sync-mhjjq\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.179391 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5924443a-434a-4efc-b04d-fdc73d3e2fe6-combined-ca-bundle\") pod \"barbican-db-sync-gtqwj\" (UID: \"5924443a-434a-4efc-b04d-fdc73d3e2fe6\") " pod="openstack/barbican-db-sync-gtqwj" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.194066 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5924443a-434a-4efc-b04d-fdc73d3e2fe6-db-sync-config-data\") pod \"barbican-db-sync-gtqwj\" (UID: \"5924443a-434a-4efc-b04d-fdc73d3e2fe6\") " pod="openstack/barbican-db-sync-gtqwj" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.211608 
4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mhjjq"] Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.226967 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4jdf\" (UniqueName: \"kubernetes.io/projected/5924443a-434a-4efc-b04d-fdc73d3e2fe6-kube-api-access-r4jdf\") pod \"barbican-db-sync-gtqwj\" (UID: \"5924443a-434a-4efc-b04d-fdc73d3e2fe6\") " pod="openstack/barbican-db-sync-gtqwj" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.244576 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" event={"ID":"5c05e490-8014-425f-a1f1-7f4374e1d7da","Type":"ContainerStarted","Data":"bdc535a5c94d2ee02d91b1838893c317b59cf33b60b12338f985b27fa6368719"} Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.244653 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.244663 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" podUID="5c05e490-8014-425f-a1f1-7f4374e1d7da" containerName="dnsmasq-dns" containerID="cri-o://bdc535a5c94d2ee02d91b1838893c317b59cf33b60b12338f985b27fa6368719" gracePeriod=10 Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.271938 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b93b66d1-a195-42b8-912d-5029d9f0e6b3-logs\") pod \"placement-db-sync-mhjjq\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.272036 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-scripts\") pod \"placement-db-sync-mhjjq\" (UID: 
\"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.272059 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-combined-ca-bundle\") pod \"placement-db-sync-mhjjq\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.272181 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw5pd\" (UniqueName: \"kubernetes.io/projected/b93b66d1-a195-42b8-912d-5029d9f0e6b3-kube-api-access-qw5pd\") pod \"placement-db-sync-mhjjq\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.272239 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-config-data\") pod \"placement-db-sync-mhjjq\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.275247 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b93b66d1-a195-42b8-912d-5029d9f0e6b3-logs\") pod \"placement-db-sync-mhjjq\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.282819 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-2mhm5"] Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.284787 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-scripts\") pod 
\"placement-db-sync-mhjjq\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.294647 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-combined-ca-bundle\") pod \"placement-db-sync-mhjjq\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.307160 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-config-data\") pod \"placement-db-sync-mhjjq\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.310378 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw5pd\" (UniqueName: \"kubernetes.io/projected/b93b66d1-a195-42b8-912d-5029d9f0e6b3-kube-api-access-qw5pd\") pod \"placement-db-sync-mhjjq\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.328267 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.336882 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.353327 4898 generic.go:334] "Generic (PLEG): container finished" podID="f8fd80d4-db6d-40ce-bbaf-13565266e5cc" containerID="a9cbb15434f9bb88ff30b78dbc766a42c4ca59a66f3a25e0b2a428474a01ffb9" exitCode=0 Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.353369 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" event={"ID":"f8fd80d4-db6d-40ce-bbaf-13565266e5cc","Type":"ContainerDied","Data":"a9cbb15434f9bb88ff30b78dbc766a42c4ca59a66f3a25e0b2a428474a01ffb9"} Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.353534 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.354322 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.361871 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.374412 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.386383 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-d89j6"] Dec 11 13:25:56 crc kubenswrapper[4898]: E1211 13:25:56.386888 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fd80d4-db6d-40ce-bbaf-13565266e5cc" containerName="dnsmasq-dns" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.386904 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fd80d4-db6d-40ce-bbaf-13565266e5cc" containerName="dnsmasq-dns" Dec 11 13:25:56 crc kubenswrapper[4898]: E1211 13:25:56.386952 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fd80d4-db6d-40ce-bbaf-13565266e5cc" containerName="init" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.386960 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fd80d4-db6d-40ce-bbaf-13565266e5cc" containerName="init" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.387205 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fd80d4-db6d-40ce-bbaf-13565266e5cc" containerName="dnsmasq-dns" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.410883 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.420647 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-d89j6"] Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.422048 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" podStartSLOduration=3.422030491 podStartE2EDuration="3.422030491s" podCreationTimestamp="2025-12-11 13:25:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:25:56.328230588 +0000 UTC m=+1313.900557025" watchObservedRunningTime="2025-12-11 13:25:56.422030491 +0000 UTC m=+1313.994356928" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.432030 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gtqwj" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.479196 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-mhjjq" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.479376 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-config\") pod \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.479430 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-ovsdbserver-nb\") pod \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.479519 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-dns-svc\") pod \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.479608 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-ovsdbserver-sb\") pod \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.487006 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-274mg\" (UniqueName: \"kubernetes.io/projected/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-kube-api-access-274mg\") pod \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\" (UID: \"f8fd80d4-db6d-40ce-bbaf-13565266e5cc\") " Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.487410 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-log-httpd\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.487435 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.487518 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-scripts\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.487550 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf29f\" (UniqueName: \"kubernetes.io/projected/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-kube-api-access-wf29f\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.487577 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.487808 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-config-data\") pod \"ceilometer-0\" (UID: 
\"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.487832 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-run-httpd\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.504935 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-kube-api-access-274mg" (OuterVolumeSpecName: "kube-api-access-274mg") pod "f8fd80d4-db6d-40ce-bbaf-13565266e5cc" (UID: "f8fd80d4-db6d-40ce-bbaf-13565266e5cc"). InnerVolumeSpecName "kube-api-access-274mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.590894 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.591134 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-config-data\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.591178 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-run-httpd\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " 
pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.591260 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.591292 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.591323 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-log-httpd\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.591351 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.591399 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-scripts\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.591440 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wf29f\" (UniqueName: \"kubernetes.io/projected/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-kube-api-access-wf29f\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.591494 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-config\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.591519 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.591558 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbrbr\" (UniqueName: \"kubernetes.io/projected/298d4481-1b69-43a5-89a8-218a127c0a51-kube-api-access-gbrbr\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.591585 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.591656 4898 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-274mg\" (UniqueName: \"kubernetes.io/projected/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-kube-api-access-274mg\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.599605 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-run-httpd\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.599883 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-log-httpd\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.610785 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-scripts\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.622806 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-config-data\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.623755 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.625897 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " 
pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.629604 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.663250 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.666946 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f8fd80d4-db6d-40ce-bbaf-13565266e5cc" (UID: "f8fd80d4-db6d-40ce-bbaf-13565266e5cc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.667252 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f8fd80d4-db6d-40ce-bbaf-13565266e5cc" (UID: "f8fd80d4-db6d-40ce-bbaf-13565266e5cc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.667805 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-config" (OuterVolumeSpecName: "config") pod "f8fd80d4-db6d-40ce-bbaf-13565266e5cc" (UID: "f8fd80d4-db6d-40ce-bbaf-13565266e5cc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.667816 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f8fd80d4-db6d-40ce-bbaf-13565266e5cc" (UID: "f8fd80d4-db6d-40ce-bbaf-13565266e5cc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.669165 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.669336 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf29f\" (UniqueName: \"kubernetes.io/projected/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-kube-api-access-wf29f\") pod \"ceilometer-0\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.670194 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.674171 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.674533 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hm6b4" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.684160 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.694539 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.697018 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.713609 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.713798 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.714518 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.718553 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-config\") pod 
\"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.720526 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbrbr\" (UniqueName: \"kubernetes.io/projected/298d4481-1b69-43a5-89a8-218a127c0a51-kube-api-access-gbrbr\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.720572 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.720823 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.721075 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.721087 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.721102 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.721114 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8fd80d4-db6d-40ce-bbaf-13565266e5cc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.725042 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.738053 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zctw4"] Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.740241 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-config\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: W1211 13:25:56.747732 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod730a9af6_05eb_4ab9_a19e_c2d359813459.slice/crio-dbbbc06f9e2d2eb7a765c29bba63540e470d4d13927ff331a91e200916d69166 WatchSource:0}: Error finding container dbbbc06f9e2d2eb7a765c29bba63540e470d4d13927ff331a91e200916d69166: Status 404 returned error can't find the container with id dbbbc06f9e2d2eb7a765c29bba63540e470d4d13927ff331a91e200916d69166 Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.751600 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.754715 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbrbr\" (UniqueName: \"kubernetes.io/projected/298d4481-1b69-43a5-89a8-218a127c0a51-kube-api-access-gbrbr\") pod \"dnsmasq-dns-56df8fb6b7-d89j6\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.790584 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.809559 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.811447 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.818350 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.825781 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.825882 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.825952 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98476a59-e1fa-4912-828f-c2bc4d6f1317-logs\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.826019 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-979zz\" (UniqueName: \"kubernetes.io/projected/98476a59-e1fa-4912-828f-c2bc4d6f1317-kube-api-access-979zz\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.826053 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/98476a59-e1fa-4912-828f-c2bc4d6f1317-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.826071 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.826133 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-scripts\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.827822 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-config-data\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.829156 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.829371 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.931501 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-scripts\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.931571 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.931675 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-config-data\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.931704 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.931755 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.931800 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.931818 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0ae50cbd-0394-40d7-8423-4243d355d939-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.931841 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.931896 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.931956 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98476a59-e1fa-4912-828f-c2bc4d6f1317-logs\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.931974 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ae50cbd-0394-40d7-8423-4243d355d939-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.932005 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkfhl\" (UniqueName: \"kubernetes.io/projected/0ae50cbd-0394-40d7-8423-4243d355d939-kube-api-access-bkfhl\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.932022 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.932069 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-979zz\" (UniqueName: \"kubernetes.io/projected/98476a59-e1fa-4912-828f-c2bc4d6f1317-kube-api-access-979zz\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.932097 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/98476a59-e1fa-4912-828f-c2bc4d6f1317-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.932115 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.932378 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.935943 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-scripts\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.936636 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98476a59-e1fa-4912-828f-c2bc4d6f1317-logs\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.944355 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/98476a59-e1fa-4912-828f-c2bc4d6f1317-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.950692 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.950945 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.954317 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-config-data\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:56 crc kubenswrapper[4898]: I1211 13:25:56.966171 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-979zz\" (UniqueName: \"kubernetes.io/projected/98476a59-e1fa-4912-828f-c2bc4d6f1317-kube-api-access-979zz\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.005209 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") " pod="openstack/glance-default-external-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.033891 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ae50cbd-0394-40d7-8423-4243d355d939-logs\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " 
pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.033944 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkfhl\" (UniqueName: \"kubernetes.io/projected/0ae50cbd-0394-40d7-8423-4243d355d939-kube-api-access-bkfhl\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.033967 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.034029 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.034070 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.034104 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 
13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.034138 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.034154 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0ae50cbd-0394-40d7-8423-4243d355d939-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.034674 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0ae50cbd-0394-40d7-8423-4243d355d939-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.036265 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ae50cbd-0394-40d7-8423-4243d355d939-logs\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.037108 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.040976 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.041358 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.041875 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.051063 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.055220 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkfhl\" (UniqueName: \"kubernetes.io/projected/0ae50cbd-0394-40d7-8423-4243d355d939-kube-api-access-bkfhl\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.081697 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.130589 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.193064 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6v7s7"] Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.210571 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-mvzhc"] Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.222332 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-2mhm5"] Dec 11 13:25:57 crc kubenswrapper[4898]: W1211 13:25:57.223736 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83e282d3_244c_4ec5_afc0_8ac624fe0879.slice/crio-a408a404d774d265c277749289462690697640952c6c08eb75a982227e486833 WatchSource:0}: Error finding container a408a404d774d265c277749289462690697640952c6c08eb75a982227e486833: Status 404 returned error can't find the container with id a408a404d774d265c277749289462690697640952c6c08eb75a982227e486833 Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.252123 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-f57gt"] Dec 11 13:25:57 crc kubenswrapper[4898]: W1211 13:25:57.264652 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ff1a3a4_ba9d_4155_b289_46de3809f5f4.slice/crio-1e4c2ee7dae75df9aa983ba3b8b3c33cac263a6a7c91858357354a129bd4e8d1 WatchSource:0}: Error finding container 1e4c2ee7dae75df9aa983ba3b8b3c33cac263a6a7c91858357354a129bd4e8d1: Status 404 returned error can't find the 
container with id 1e4c2ee7dae75df9aa983ba3b8b3c33cac263a6a7c91858357354a129bd4e8d1 Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.275213 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.378058 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" event={"ID":"f8fd80d4-db6d-40ce-bbaf-13565266e5cc","Type":"ContainerDied","Data":"ce268f2a7f6b938bc17435e640e4c1d276a996d07747708cf3e4099c93464507"} Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.378079 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-w4rz8" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.378389 4898 scope.go:117] "RemoveContainer" containerID="a9cbb15434f9bb88ff30b78dbc766a42c4ca59a66f3a25e0b2a428474a01ffb9" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.384554 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6v7s7" event={"ID":"76d89e82-8f2e-4198-8736-28293404a0bd","Type":"ContainerStarted","Data":"70d347fdd48e904d66a228ed5fa1d1994b4b9c1d3260d11fc00099a4d99f221c"} Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.386283 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f57gt" event={"ID":"4ff1a3a4-ba9d-4155-b289-46de3809f5f4","Type":"ContainerStarted","Data":"1e4c2ee7dae75df9aa983ba3b8b3c33cac263a6a7c91858357354a129bd4e8d1"} Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.387346 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-mvzhc" event={"ID":"23459d62-b558-4f82-a875-311d5fa486e5","Type":"ContainerStarted","Data":"5326e65bd790418544772d15d689ceb2f8acd7cf5eab25c5ae6b87a2e2a3c8ef"} Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.389478 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" event={"ID":"83e282d3-244c-4ec5-afc0-8ac624fe0879","Type":"ContainerStarted","Data":"a408a404d774d265c277749289462690697640952c6c08eb75a982227e486833"} Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.418734 4898 scope.go:117] "RemoveContainer" containerID="3d1a3b5c868a63a4a586e8e454fa7fba969e9fa25900977fbad4682786865ae5" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.418898 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zctw4" event={"ID":"730a9af6-05eb-4ab9-a19e-c2d359813459","Type":"ContainerStarted","Data":"5c45ba0adcd382f966e106c460bc5c3e858dd387f9988d492a1a441951833b10"} Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.418929 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zctw4" event={"ID":"730a9af6-05eb-4ab9-a19e-c2d359813459","Type":"ContainerStarted","Data":"dbbbc06f9e2d2eb7a765c29bba63540e470d4d13927ff331a91e200916d69166"} Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.423332 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-w4rz8"] Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.438052 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-w4rz8"] Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.561696 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-zctw4" podStartSLOduration=2.561676134 podStartE2EDuration="2.561676134s" podCreationTimestamp="2025-12-11 13:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:25:57.440286944 +0000 UTC m=+1315.012613381" watchObservedRunningTime="2025-12-11 13:25:57.561676134 +0000 UTC m=+1315.134002571" Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.580514 4898 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-db-sync-gtqwj"] Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.622547 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mhjjq"] Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.646212 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:25:57 crc kubenswrapper[4898]: W1211 13:25:57.687037 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce9ab866_6ba7_4782_a002_5f0f4c252b4e.slice/crio-2bcac39c7b4842809b5d1cba6f56405506a81fb3096449e3605b2ccbf4a393e8 WatchSource:0}: Error finding container 2bcac39c7b4842809b5d1cba6f56405506a81fb3096449e3605b2ccbf4a393e8: Status 404 returned error can't find the container with id 2bcac39c7b4842809b5d1cba6f56405506a81fb3096449e3605b2ccbf4a393e8 Dec 11 13:25:57 crc kubenswrapper[4898]: I1211 13:25:57.722225 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-d89j6"] Dec 11 13:25:57 crc kubenswrapper[4898]: W1211 13:25:57.734211 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod298d4481_1b69_43a5_89a8_218a127c0a51.slice/crio-d21566ca382cbfa9930d87bec625a05a530a0c1e429b28dabcbad20d945a0b27 WatchSource:0}: Error finding container d21566ca382cbfa9930d87bec625a05a530a0c1e429b28dabcbad20d945a0b27: Status 404 returned error can't find the container with id d21566ca382cbfa9930d87bec625a05a530a0c1e429b28dabcbad20d945a0b27 Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.010384 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.147812 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 13:25:58 crc kubenswrapper[4898]: W1211 
13:25:58.179768 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc WatchSource:0}: Error finding container 600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc: Status 404 returned error can't find the container with id 600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.367358 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.442121 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f57gt" event={"ID":"4ff1a3a4-ba9d-4155-b289-46de3809f5f4","Type":"ContainerStarted","Data":"30bd400b99cd12f96e97ead55af6bcd9b8d9bf06cfe4939424e74ac281a9f0d5"} Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.481816 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mhjjq" event={"ID":"b93b66d1-a195-42b8-912d-5029d9f0e6b3","Type":"ContainerStarted","Data":"14e779bfe73e233dab3b23e31aa82f28daec4112cfc07c5be1b0b99208ecc942"} Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.491928 4898 generic.go:334] "Generic (PLEG): container finished" podID="298d4481-1b69-43a5-89a8-218a127c0a51" containerID="d7f6d8d4d2c0a3a2f0fc3a8e92e503d5808996cbfed6e3d92b807573d84addc7" exitCode=0 Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.496813 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" event={"ID":"298d4481-1b69-43a5-89a8-218a127c0a51","Type":"ContainerDied","Data":"d7f6d8d4d2c0a3a2f0fc3a8e92e503d5808996cbfed6e3d92b807573d84addc7"} Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.496897 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" event={"ID":"298d4481-1b69-43a5-89a8-218a127c0a51","Type":"ContainerStarted","Data":"d21566ca382cbfa9930d87bec625a05a530a0c1e429b28dabcbad20d945a0b27"} Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.508742 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gtqwj" event={"ID":"5924443a-434a-4efc-b04d-fdc73d3e2fe6","Type":"ContainerStarted","Data":"8cbc09377cca8b85fadfd8e851a500a7781315b2e7075965b2a3130d13224ec6"} Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.526503 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce9ab866-6ba7-4782-a002-5f0f4c252b4e","Type":"ContainerStarted","Data":"2bcac39c7b4842809b5d1cba6f56405506a81fb3096449e3605b2ccbf4a393e8"} Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.545740 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"98476a59-e1fa-4912-828f-c2bc4d6f1317","Type":"ContainerStarted","Data":"600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc"} Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.563514 4898 generic.go:334] "Generic (PLEG): container finished" podID="83e282d3-244c-4ec5-afc0-8ac624fe0879" containerID="4273caf342a2350d74aac41c0b4558de29e37d87660e30df6d212e74e8d0cd25" exitCode=0 Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.563628 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" event={"ID":"83e282d3-244c-4ec5-afc0-8ac624fe0879","Type":"ContainerDied","Data":"4273caf342a2350d74aac41c0b4558de29e37d87660e30df6d212e74e8d0cd25"} Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.572047 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.597668 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"0ae50cbd-0394-40d7-8423-4243d355d939","Type":"ContainerStarted","Data":"f7077c193b90d7f93bca10ff9f0887d5f719259ac257cfa3185a5364dfadeb59"} Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.649820 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-f57gt" podStartSLOduration=3.649797788 podStartE2EDuration="3.649797788s" podCreationTimestamp="2025-12-11 13:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:25:58.596905984 +0000 UTC m=+1316.169232431" watchObservedRunningTime="2025-12-11 13:25:58.649797788 +0000 UTC m=+1316.222124225" Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.718677 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:25:58 crc kubenswrapper[4898]: I1211 13:25:58.828354 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8fd80d4-db6d-40ce-bbaf-13565266e5cc" path="/var/lib/kubelet/pods/f8fd80d4-db6d-40ce-bbaf-13565266e5cc/volumes" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.415792 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.535992 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ndbb\" (UniqueName: \"kubernetes.io/projected/83e282d3-244c-4ec5-afc0-8ac624fe0879-kube-api-access-2ndbb\") pod \"83e282d3-244c-4ec5-afc0-8ac624fe0879\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.536147 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-ovsdbserver-sb\") pod \"83e282d3-244c-4ec5-afc0-8ac624fe0879\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.536173 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-dns-svc\") pod \"83e282d3-244c-4ec5-afc0-8ac624fe0879\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.536197 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-config\") pod \"83e282d3-244c-4ec5-afc0-8ac624fe0879\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.536314 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-ovsdbserver-nb\") pod \"83e282d3-244c-4ec5-afc0-8ac624fe0879\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.536395 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-dns-swift-storage-0\") pod \"83e282d3-244c-4ec5-afc0-8ac624fe0879\" (UID: \"83e282d3-244c-4ec5-afc0-8ac624fe0879\") " Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.545103 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e282d3-244c-4ec5-afc0-8ac624fe0879-kube-api-access-2ndbb" (OuterVolumeSpecName: "kube-api-access-2ndbb") pod "83e282d3-244c-4ec5-afc0-8ac624fe0879" (UID: "83e282d3-244c-4ec5-afc0-8ac624fe0879"). InnerVolumeSpecName "kube-api-access-2ndbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.569618 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-config" (OuterVolumeSpecName: "config") pod "83e282d3-244c-4ec5-afc0-8ac624fe0879" (UID: "83e282d3-244c-4ec5-afc0-8ac624fe0879"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.571507 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "83e282d3-244c-4ec5-afc0-8ac624fe0879" (UID: "83e282d3-244c-4ec5-afc0-8ac624fe0879"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.576720 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "83e282d3-244c-4ec5-afc0-8ac624fe0879" (UID: "83e282d3-244c-4ec5-afc0-8ac624fe0879"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.615058 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0ae50cbd-0394-40d7-8423-4243d355d939","Type":"ContainerStarted","Data":"6ef43e5e43aca1249e1273c0c990ecce9900dcd616b86611cf8962deeefe1c7a"} Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.617805 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" event={"ID":"298d4481-1b69-43a5-89a8-218a127c0a51","Type":"ContainerStarted","Data":"b0f3105ea6f8d99d069fecdbc2b1b5adb7a42fc436acac65aa7207065eb67b9d"} Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.618960 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.620502 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"98476a59-e1fa-4912-828f-c2bc4d6f1317","Type":"ContainerStarted","Data":"a1517f1676b5404bec7c06d64374a259366eb52d1ac868e2c4be19f4bf56cfd4"} Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.623356 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.623609 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-2mhm5" event={"ID":"83e282d3-244c-4ec5-afc0-8ac624fe0879","Type":"ContainerDied","Data":"a408a404d774d265c277749289462690697640952c6c08eb75a982227e486833"} Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.623644 4898 scope.go:117] "RemoveContainer" containerID="4273caf342a2350d74aac41c0b4558de29e37d87660e30df6d212e74e8d0cd25" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.637871 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "83e282d3-244c-4ec5-afc0-8ac624fe0879" (UID: "83e282d3-244c-4ec5-afc0-8ac624fe0879"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.641149 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.641173 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.641183 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.641190 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.641199 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ndbb\" (UniqueName: \"kubernetes.io/projected/83e282d3-244c-4ec5-afc0-8ac624fe0879-kube-api-access-2ndbb\") on node \"crc\" DevicePath \"\"" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.645818 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "83e282d3-244c-4ec5-afc0-8ac624fe0879" (UID: "83e282d3-244c-4ec5-afc0-8ac624fe0879"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.657433 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" podStartSLOduration=3.657416851 podStartE2EDuration="3.657416851s" podCreationTimestamp="2025-12-11 13:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:25:59.654874423 +0000 UTC m=+1317.227200860" watchObservedRunningTime="2025-12-11 13:25:59.657416851 +0000 UTC m=+1317.229743288" Dec 11 13:25:59 crc kubenswrapper[4898]: I1211 13:25:59.743511 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83e282d3-244c-4ec5-afc0-8ac624fe0879-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:00 crc kubenswrapper[4898]: I1211 13:26:00.076846 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-2mhm5"] Dec 11 13:26:00 crc kubenswrapper[4898]: I1211 13:26:00.100166 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-2mhm5"] Dec 11 13:26:00 crc 
kubenswrapper[4898]: I1211 13:26:00.648823 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"98476a59-e1fa-4912-828f-c2bc4d6f1317","Type":"ContainerStarted","Data":"9ab1a5584ad1e5a2191fc14c17d9a21a0da2fa89920b471a8ba52fedc981117e"}
Dec 11 13:26:00 crc kubenswrapper[4898]: I1211 13:26:00.649054 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="98476a59-e1fa-4912-828f-c2bc4d6f1317" containerName="glance-log" containerID="cri-o://a1517f1676b5404bec7c06d64374a259366eb52d1ac868e2c4be19f4bf56cfd4" gracePeriod=30
Dec 11 13:26:00 crc kubenswrapper[4898]: I1211 13:26:00.649208 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="98476a59-e1fa-4912-828f-c2bc4d6f1317" containerName="glance-httpd" containerID="cri-o://9ab1a5584ad1e5a2191fc14c17d9a21a0da2fa89920b471a8ba52fedc981117e" gracePeriod=30
Dec 11 13:26:00 crc kubenswrapper[4898]: I1211 13:26:00.655480 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0ae50cbd-0394-40d7-8423-4243d355d939" containerName="glance-log" containerID="cri-o://6ef43e5e43aca1249e1273c0c990ecce9900dcd616b86611cf8962deeefe1c7a" gracePeriod=30
Dec 11 13:26:00 crc kubenswrapper[4898]: I1211 13:26:00.656016 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0ae50cbd-0394-40d7-8423-4243d355d939","Type":"ContainerStarted","Data":"d211b43a440846506d05834ba51cf751f17230edd367bb90880771555c3d43e4"}
Dec 11 13:26:00 crc kubenswrapper[4898]: I1211 13:26:00.655913 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0ae50cbd-0394-40d7-8423-4243d355d939" containerName="glance-httpd" containerID="cri-o://d211b43a440846506d05834ba51cf751f17230edd367bb90880771555c3d43e4" gracePeriod=30
Dec 11 13:26:00 crc kubenswrapper[4898]: I1211 13:26:00.686097 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.686079648 podStartE2EDuration="5.686079648s" podCreationTimestamp="2025-12-11 13:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:26:00.672610633 +0000 UTC m=+1318.244937070" watchObservedRunningTime="2025-12-11 13:26:00.686079648 +0000 UTC m=+1318.258406085"
Dec 11 13:26:00 crc kubenswrapper[4898]: I1211 13:26:00.713664 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.7136181740000005 podStartE2EDuration="5.713618174s" podCreationTimestamp="2025-12-11 13:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:26:00.697723825 +0000 UTC m=+1318.270050262" watchObservedRunningTime="2025-12-11 13:26:00.713618174 +0000 UTC m=+1318.285944631"
Dec 11 13:26:00 crc kubenswrapper[4898]: I1211 13:26:00.811767 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83e282d3-244c-4ec5-afc0-8ac624fe0879" path="/var/lib/kubelet/pods/83e282d3-244c-4ec5-afc0-8ac624fe0879/volumes"
Dec 11 13:26:00 crc kubenswrapper[4898]: E1211 13:26:00.978002 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ae50cbd_0394_40d7_8423_4243d355d939.slice/crio-d211b43a440846506d05834ba51cf751f17230edd367bb90880771555c3d43e4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-9ab1a5584ad1e5a2191fc14c17d9a21a0da2fa89920b471a8ba52fedc981117e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-conmon-9ab1a5584ad1e5a2191fc14c17d9a21a0da2fa89920b471a8ba52fedc981117e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ae50cbd_0394_40d7_8423_4243d355d939.slice/crio-6ef43e5e43aca1249e1273c0c990ecce9900dcd616b86611cf8962deeefe1c7a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ae50cbd_0394_40d7_8423_4243d355d939.slice/crio-conmon-6ef43e5e43aca1249e1273c0c990ecce9900dcd616b86611cf8962deeefe1c7a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-a1517f1676b5404bec7c06d64374a259366eb52d1ac868e2c4be19f4bf56cfd4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-conmon-a1517f1676b5404bec7c06d64374a259366eb52d1ac868e2c4be19f4bf56cfd4.scope\": RecentStats: unable to find data in memory cache]"
Dec 11 13:26:01 crc kubenswrapper[4898]: I1211 13:26:01.666397 4898 generic.go:334] "Generic (PLEG): container finished" podID="0ae50cbd-0394-40d7-8423-4243d355d939" containerID="d211b43a440846506d05834ba51cf751f17230edd367bb90880771555c3d43e4" exitCode=143
Dec 11 13:26:01 crc kubenswrapper[4898]: I1211 13:26:01.666709 4898 generic.go:334] "Generic (PLEG): container finished" podID="0ae50cbd-0394-40d7-8423-4243d355d939" containerID="6ef43e5e43aca1249e1273c0c990ecce9900dcd616b86611cf8962deeefe1c7a" exitCode=143
Dec 11 13:26:01 crc kubenswrapper[4898]: I1211 13:26:01.666479 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0ae50cbd-0394-40d7-8423-4243d355d939","Type":"ContainerDied","Data":"d211b43a440846506d05834ba51cf751f17230edd367bb90880771555c3d43e4"}
Dec 11 13:26:01 crc kubenswrapper[4898]: I1211 13:26:01.666789 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0ae50cbd-0394-40d7-8423-4243d355d939","Type":"ContainerDied","Data":"6ef43e5e43aca1249e1273c0c990ecce9900dcd616b86611cf8962deeefe1c7a"}
Dec 11 13:26:01 crc kubenswrapper[4898]: I1211 13:26:01.670723 4898 generic.go:334] "Generic (PLEG): container finished" podID="98476a59-e1fa-4912-828f-c2bc4d6f1317" containerID="9ab1a5584ad1e5a2191fc14c17d9a21a0da2fa89920b471a8ba52fedc981117e" exitCode=143
Dec 11 13:26:01 crc kubenswrapper[4898]: I1211 13:26:01.670752 4898 generic.go:334] "Generic (PLEG): container finished" podID="98476a59-e1fa-4912-828f-c2bc4d6f1317" containerID="a1517f1676b5404bec7c06d64374a259366eb52d1ac868e2c4be19f4bf56cfd4" exitCode=143
Dec 11 13:26:01 crc kubenswrapper[4898]: I1211 13:26:01.671003 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"98476a59-e1fa-4912-828f-c2bc4d6f1317","Type":"ContainerDied","Data":"9ab1a5584ad1e5a2191fc14c17d9a21a0da2fa89920b471a8ba52fedc981117e"}
Dec 11 13:26:01 crc kubenswrapper[4898]: I1211 13:26:01.671039 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"98476a59-e1fa-4912-828f-c2bc4d6f1317","Type":"ContainerDied","Data":"a1517f1676b5404bec7c06d64374a259366eb52d1ac868e2c4be19f4bf56cfd4"}
Dec 11 13:26:02 crc kubenswrapper[4898]: I1211 13:26:02.683413 4898 generic.go:334] "Generic (PLEG): container finished" podID="730a9af6-05eb-4ab9-a19e-c2d359813459" containerID="5c45ba0adcd382f966e106c460bc5c3e858dd387f9988d492a1a441951833b10" exitCode=0
Dec 11 13:26:02 crc kubenswrapper[4898]: I1211 13:26:02.683678 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zctw4" event={"ID":"730a9af6-05eb-4ab9-a19e-c2d359813459","Type":"ContainerDied","Data":"5c45ba0adcd382f966e106c460bc5c3e858dd387f9988d492a1a441951833b10"}
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.036579 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.067429 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.699820 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.753637 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-9flrp"
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.932073 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.937848 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.978823 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98476a59-e1fa-4912-828f-c2bc4d6f1317-logs\") pod \"98476a59-e1fa-4912-828f-c2bc4d6f1317\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.978909 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ae50cbd-0394-40d7-8423-4243d355d939-logs\") pod \"0ae50cbd-0394-40d7-8423-4243d355d939\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.978975 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-scripts\") pod \"0ae50cbd-0394-40d7-8423-4243d355d939\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.979006 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-combined-ca-bundle\") pod \"98476a59-e1fa-4912-828f-c2bc4d6f1317\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.979037 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-scripts\") pod \"98476a59-e1fa-4912-828f-c2bc4d6f1317\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.979056 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"0ae50cbd-0394-40d7-8423-4243d355d939\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.979099 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-config-data\") pod \"98476a59-e1fa-4912-828f-c2bc4d6f1317\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.979167 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"98476a59-e1fa-4912-828f-c2bc4d6f1317\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.979206 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-config-data\") pod \"0ae50cbd-0394-40d7-8423-4243d355d939\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.979288 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-combined-ca-bundle\") pod \"0ae50cbd-0394-40d7-8423-4243d355d939\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.979332 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-public-tls-certs\") pod \"98476a59-e1fa-4912-828f-c2bc4d6f1317\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.979365 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-internal-tls-certs\") pod \"0ae50cbd-0394-40d7-8423-4243d355d939\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.979393 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/98476a59-e1fa-4912-828f-c2bc4d6f1317-httpd-run\") pod \"98476a59-e1fa-4912-828f-c2bc4d6f1317\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.979476 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0ae50cbd-0394-40d7-8423-4243d355d939-httpd-run\") pod \"0ae50cbd-0394-40d7-8423-4243d355d939\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.979550 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkfhl\" (UniqueName: \"kubernetes.io/projected/0ae50cbd-0394-40d7-8423-4243d355d939-kube-api-access-bkfhl\") pod \"0ae50cbd-0394-40d7-8423-4243d355d939\" (UID: \"0ae50cbd-0394-40d7-8423-4243d355d939\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.979585 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-979zz\" (UniqueName: \"kubernetes.io/projected/98476a59-e1fa-4912-828f-c2bc4d6f1317-kube-api-access-979zz\") pod \"98476a59-e1fa-4912-828f-c2bc4d6f1317\" (UID: \"98476a59-e1fa-4912-828f-c2bc4d6f1317\") "
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.991913 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "98476a59-e1fa-4912-828f-c2bc4d6f1317" (UID: "98476a59-e1fa-4912-828f-c2bc4d6f1317"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.992240 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98476a59-e1fa-4912-828f-c2bc4d6f1317-logs" (OuterVolumeSpecName: "logs") pod "98476a59-e1fa-4912-828f-c2bc4d6f1317" (UID: "98476a59-e1fa-4912-828f-c2bc4d6f1317"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.992589 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ae50cbd-0394-40d7-8423-4243d355d939-logs" (OuterVolumeSpecName: "logs") pod "0ae50cbd-0394-40d7-8423-4243d355d939" (UID: "0ae50cbd-0394-40d7-8423-4243d355d939"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.993068 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98476a59-e1fa-4912-828f-c2bc4d6f1317-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "98476a59-e1fa-4912-828f-c2bc4d6f1317" (UID: "98476a59-e1fa-4912-828f-c2bc4d6f1317"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 11 13:26:03 crc kubenswrapper[4898]: I1211 13:26:03.993131 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ae50cbd-0394-40d7-8423-4243d355d939-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0ae50cbd-0394-40d7-8423-4243d355d939" (UID: "0ae50cbd-0394-40d7-8423-4243d355d939"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.001491 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ae50cbd-0394-40d7-8423-4243d355d939-kube-api-access-bkfhl" (OuterVolumeSpecName: "kube-api-access-bkfhl") pod "0ae50cbd-0394-40d7-8423-4243d355d939" (UID: "0ae50cbd-0394-40d7-8423-4243d355d939"). InnerVolumeSpecName "kube-api-access-bkfhl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.003588 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-scripts" (OuterVolumeSpecName: "scripts") pod "0ae50cbd-0394-40d7-8423-4243d355d939" (UID: "0ae50cbd-0394-40d7-8423-4243d355d939"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.014148 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-scripts" (OuterVolumeSpecName: "scripts") pod "98476a59-e1fa-4912-828f-c2bc4d6f1317" (UID: "98476a59-e1fa-4912-828f-c2bc4d6f1317"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.014163 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98476a59-e1fa-4912-828f-c2bc4d6f1317-kube-api-access-979zz" (OuterVolumeSpecName: "kube-api-access-979zz") pod "98476a59-e1fa-4912-828f-c2bc4d6f1317" (UID: "98476a59-e1fa-4912-828f-c2bc4d6f1317"). InnerVolumeSpecName "kube-api-access-979zz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.014956 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "0ae50cbd-0394-40d7-8423-4243d355d939" (UID: "0ae50cbd-0394-40d7-8423-4243d355d939"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.071438 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ae50cbd-0394-40d7-8423-4243d355d939" (UID: "0ae50cbd-0394-40d7-8423-4243d355d939"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.083689 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/98476a59-e1fa-4912-828f-c2bc4d6f1317-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.083722 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0ae50cbd-0394-40d7-8423-4243d355d939-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.083734 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkfhl\" (UniqueName: \"kubernetes.io/projected/0ae50cbd-0394-40d7-8423-4243d355d939-kube-api-access-bkfhl\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.083748 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-979zz\" (UniqueName: \"kubernetes.io/projected/98476a59-e1fa-4912-828f-c2bc4d6f1317-kube-api-access-979zz\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.083758 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98476a59-e1fa-4912-828f-c2bc4d6f1317-logs\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.083768 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ae50cbd-0394-40d7-8423-4243d355d939-logs\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.083778 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-scripts\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.083788 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-scripts\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.083811 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.083830 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.083841 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.089857 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "98476a59-e1fa-4912-828f-c2bc4d6f1317" (UID: "98476a59-e1fa-4912-828f-c2bc4d6f1317"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.119847 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-config-data" (OuterVolumeSpecName: "config-data") pod "98476a59-e1fa-4912-828f-c2bc4d6f1317" (UID: "98476a59-e1fa-4912-828f-c2bc4d6f1317"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.123047 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0ae50cbd-0394-40d7-8423-4243d355d939" (UID: "0ae50cbd-0394-40d7-8423-4243d355d939"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.126586 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98476a59-e1fa-4912-828f-c2bc4d6f1317" (UID: "98476a59-e1fa-4912-828f-c2bc4d6f1317"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.132548 4898 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.146015 4898 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.174151 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-config-data" (OuterVolumeSpecName: "config-data") pod "0ae50cbd-0394-40d7-8423-4243d355d939" (UID: "0ae50cbd-0394-40d7-8423-4243d355d939"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.190341 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.190374 4898 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.190383 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-config-data\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.190393 4898 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.190401 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-config-data\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.190411 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98476a59-e1fa-4912-828f-c2bc4d6f1317-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.190419 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ae50cbd-0394-40d7-8423-4243d355d939-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.717241 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.717772 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0ae50cbd-0394-40d7-8423-4243d355d939","Type":"ContainerDied","Data":"f7077c193b90d7f93bca10ff9f0887d5f719259ac257cfa3185a5364dfadeb59"}
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.717838 4898 scope.go:117] "RemoveContainer" containerID="d211b43a440846506d05834ba51cf751f17230edd367bb90880771555c3d43e4"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.722442 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.722762 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"98476a59-e1fa-4912-828f-c2bc4d6f1317","Type":"ContainerDied","Data":"600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc"}
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.761166 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.775238 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.797673 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ae50cbd-0394-40d7-8423-4243d355d939" path="/var/lib/kubelet/pods/0ae50cbd-0394-40d7-8423-4243d355d939/volumes"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.806112 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.814281 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.822218 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 11 13:26:04 crc kubenswrapper[4898]: E1211 13:26:04.822770 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98476a59-e1fa-4912-828f-c2bc4d6f1317" containerName="glance-httpd"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.822788 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="98476a59-e1fa-4912-828f-c2bc4d6f1317" containerName="glance-httpd"
Dec 11 13:26:04 crc kubenswrapper[4898]: E1211 13:26:04.822815 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae50cbd-0394-40d7-8423-4243d355d939" containerName="glance-log"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.822822 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae50cbd-0394-40d7-8423-4243d355d939" containerName="glance-log"
Dec 11 13:26:04 crc kubenswrapper[4898]: E1211 13:26:04.822831 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98476a59-e1fa-4912-828f-c2bc4d6f1317" containerName="glance-log"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.822839 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="98476a59-e1fa-4912-828f-c2bc4d6f1317" containerName="glance-log"
Dec 11 13:26:04 crc kubenswrapper[4898]: E1211 13:26:04.822856 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e282d3-244c-4ec5-afc0-8ac624fe0879" containerName="init"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.822862 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e282d3-244c-4ec5-afc0-8ac624fe0879" containerName="init"
Dec 11 13:26:04 crc kubenswrapper[4898]: E1211 13:26:04.822871 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae50cbd-0394-40d7-8423-4243d355d939" containerName="glance-httpd"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.822876 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae50cbd-0394-40d7-8423-4243d355d939" containerName="glance-httpd"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.823076 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae50cbd-0394-40d7-8423-4243d355d939" containerName="glance-httpd"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.823092 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="98476a59-e1fa-4912-828f-c2bc4d6f1317" containerName="glance-log"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.823105 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="98476a59-e1fa-4912-828f-c2bc4d6f1317" containerName="glance-httpd"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.823122 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="83e282d3-244c-4ec5-afc0-8ac624fe0879" containerName="init"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.823138 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae50cbd-0394-40d7-8423-4243d355d939" containerName="glance-log"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.824485 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.850109 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.852254 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.867411 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.867732 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.867837 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.867846 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hm6b4"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.868152 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.868329 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.873300 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 11 13:26:04 crc kubenswrapper[4898]: I1211 13:26:04.913534 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.009629 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ftmf\" (UniqueName: \"kubernetes.io/projected/8c0762b4-f3a4-4243-8df8-94e805983b4b-kube-api-access-2ftmf\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.009702 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.009731 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.009751 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.009776 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.009827 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c0762b4-f3a4-4243-8df8-94e805983b4b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.009846 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c0762b4-f3a4-4243-8df8-94e805983b4b-logs\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.009868 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.009884 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.009926 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.009952 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wbfk\" (UniqueName: \"kubernetes.io/projected/1dc1c258-0a10-41a3-a831-0b8b1878ac80-kube-api-access-4wbfk\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.010034 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dc1c258-0a10-41a3-a831-0b8b1878ac80-logs\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.010051 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1dc1c258-0a10-41a3-a831-0b8b1878ac80-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.010075 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.010098 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.010114 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.112090 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ftmf\" (UniqueName: \"kubernetes.io/projected/8c0762b4-f3a4-4243-8df8-94e805983b4b-kube-api-access-2ftmf\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.112169 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0"
Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.112201 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") "
pod="openstack/glance-default-external-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.112222 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.112251 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.112310 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c0762b4-f3a4-4243-8df8-94e805983b4b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.112330 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c0762b4-f3a4-4243-8df8-94e805983b4b-logs\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.112351 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.112532 4898 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.112696 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.112898 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c0762b4-f3a4-4243-8df8-94e805983b4b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.113006 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c0762b4-f3a4-4243-8df8-94e805983b4b-logs\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.112371 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.116357 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.116408 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wbfk\" (UniqueName: \"kubernetes.io/projected/1dc1c258-0a10-41a3-a831-0b8b1878ac80-kube-api-access-4wbfk\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.116460 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dc1c258-0a10-41a3-a831-0b8b1878ac80-logs\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.116498 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1dc1c258-0a10-41a3-a831-0b8b1878ac80-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.116540 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.116582 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.116600 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.116944 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dc1c258-0a10-41a3-a831-0b8b1878ac80-logs\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.117525 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1dc1c258-0a10-41a3-a831-0b8b1878ac80-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.119791 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.121232 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.121363 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.121303 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.121967 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.122229 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.122572 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0" Dec 
11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.123838 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.139518 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ftmf\" (UniqueName: \"kubernetes.io/projected/8c0762b4-f3a4-4243-8df8-94e805983b4b-kube-api-access-2ftmf\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.143094 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wbfk\" (UniqueName: \"kubernetes.io/projected/1dc1c258-0a10-41a3-a831-0b8b1878ac80-kube-api-access-4wbfk\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.186776 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.201914 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " pod="openstack/glance-default-external-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.468986 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 13:26:05 crc kubenswrapper[4898]: I1211 13:26:05.491289 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 13:26:06 crc kubenswrapper[4898]: E1211 13:26:06.313051 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ae50cbd_0394_40d7_8423_4243d355d939.slice/crio-f7077c193b90d7f93bca10ff9f0887d5f719259ac257cfa3185a5364dfadeb59\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c05e490_8014_425f_a1f1_7f4374e1d7da.slice/crio-bdc535a5c94d2ee02d91b1838893c317b59cf33b60b12338f985b27fa6368719.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ae50cbd_0394_40d7_8423_4243d355d939.slice\": RecentStats: unable to find data in memory cache]" Dec 11 13:26:06 crc kubenswrapper[4898]: I1211 13:26:06.741942 4898 generic.go:334] "Generic (PLEG): container finished" podID="5c05e490-8014-425f-a1f1-7f4374e1d7da" containerID="bdc535a5c94d2ee02d91b1838893c317b59cf33b60b12338f985b27fa6368719" exitCode=137 Dec 11 13:26:06 crc kubenswrapper[4898]: I1211 13:26:06.742364 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" event={"ID":"5c05e490-8014-425f-a1f1-7f4374e1d7da","Type":"ContainerDied","Data":"bdc535a5c94d2ee02d91b1838893c317b59cf33b60b12338f985b27fa6368719"} Dec 11 13:26:06 crc kubenswrapper[4898]: I1211 13:26:06.790667 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98476a59-e1fa-4912-828f-c2bc4d6f1317" 
path="/var/lib/kubelet/pods/98476a59-e1fa-4912-828f-c2bc4d6f1317/volumes" Dec 11 13:26:06 crc kubenswrapper[4898]: I1211 13:26:06.792745 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:26:06 crc kubenswrapper[4898]: I1211 13:26:06.854126 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-kljnd"] Dec 11 13:26:06 crc kubenswrapper[4898]: I1211 13:26:06.854362 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" podUID="75e494bf-b288-45e2-8f87-7c146a9bb74f" containerName="dnsmasq-dns" containerID="cri-o://315e559ddb3ac3b90876b8105f8d971ba84c4c052f8ac4100c18d896859254f1" gracePeriod=10 Dec 11 13:26:06 crc kubenswrapper[4898]: I1211 13:26:06.992581 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.073101 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-scripts\") pod \"730a9af6-05eb-4ab9-a19e-c2d359813459\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.073206 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-credential-keys\") pod \"730a9af6-05eb-4ab9-a19e-c2d359813459\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.073229 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-config-data\") pod \"730a9af6-05eb-4ab9-a19e-c2d359813459\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " 
Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.073332 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdgpd\" (UniqueName: \"kubernetes.io/projected/730a9af6-05eb-4ab9-a19e-c2d359813459-kube-api-access-cdgpd\") pod \"730a9af6-05eb-4ab9-a19e-c2d359813459\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.073367 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-fernet-keys\") pod \"730a9af6-05eb-4ab9-a19e-c2d359813459\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.073412 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-combined-ca-bundle\") pod \"730a9af6-05eb-4ab9-a19e-c2d359813459\" (UID: \"730a9af6-05eb-4ab9-a19e-c2d359813459\") " Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.082706 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "730a9af6-05eb-4ab9-a19e-c2d359813459" (UID: "730a9af6-05eb-4ab9-a19e-c2d359813459"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.084189 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/730a9af6-05eb-4ab9-a19e-c2d359813459-kube-api-access-cdgpd" (OuterVolumeSpecName: "kube-api-access-cdgpd") pod "730a9af6-05eb-4ab9-a19e-c2d359813459" (UID: "730a9af6-05eb-4ab9-a19e-c2d359813459"). InnerVolumeSpecName "kube-api-access-cdgpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.085368 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-scripts" (OuterVolumeSpecName: "scripts") pod "730a9af6-05eb-4ab9-a19e-c2d359813459" (UID: "730a9af6-05eb-4ab9-a19e-c2d359813459"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.135110 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "730a9af6-05eb-4ab9-a19e-c2d359813459" (UID: "730a9af6-05eb-4ab9-a19e-c2d359813459"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.151593 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-config-data" (OuterVolumeSpecName: "config-data") pod "730a9af6-05eb-4ab9-a19e-c2d359813459" (UID: "730a9af6-05eb-4ab9-a19e-c2d359813459"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.151873 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "730a9af6-05eb-4ab9-a19e-c2d359813459" (UID: "730a9af6-05eb-4ab9-a19e-c2d359813459"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.177124 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.177162 4898 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.177172 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.177184 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdgpd\" (UniqueName: \"kubernetes.io/projected/730a9af6-05eb-4ab9-a19e-c2d359813459-kube-api-access-cdgpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.177193 4898 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.177203 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/730a9af6-05eb-4ab9-a19e-c2d359813459-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.762065 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zctw4" Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.762072 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zctw4" event={"ID":"730a9af6-05eb-4ab9-a19e-c2d359813459","Type":"ContainerDied","Data":"dbbbc06f9e2d2eb7a765c29bba63540e470d4d13927ff331a91e200916d69166"} Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.762229 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbbbc06f9e2d2eb7a765c29bba63540e470d4d13927ff331a91e200916d69166" Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.764942 4898 generic.go:334] "Generic (PLEG): container finished" podID="75e494bf-b288-45e2-8f87-7c146a9bb74f" containerID="315e559ddb3ac3b90876b8105f8d971ba84c4c052f8ac4100c18d896859254f1" exitCode=0 Dec 11 13:26:07 crc kubenswrapper[4898]: I1211 13:26:07.764976 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" event={"ID":"75e494bf-b288-45e2-8f87-7c146a9bb74f","Type":"ContainerDied","Data":"315e559ddb3ac3b90876b8105f8d971ba84c4c052f8ac4100c18d896859254f1"} Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.146585 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-zctw4"] Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.162175 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-zctw4"] Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.230846 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ml4kr"] Dec 11 13:26:08 crc kubenswrapper[4898]: E1211 13:26:08.231253 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="730a9af6-05eb-4ab9-a19e-c2d359813459" containerName="keystone-bootstrap" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.231295 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="730a9af6-05eb-4ab9-a19e-c2d359813459" containerName="keystone-bootstrap" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.231546 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="730a9af6-05eb-4ab9-a19e-c2d359813459" containerName="keystone-bootstrap" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.232251 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.236981 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.237097 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.236982 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.242680 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.242991 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qkphd" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.296787 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ml4kr"] Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.315887 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-fernet-keys\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.316004 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fqhjs\" (UniqueName: \"kubernetes.io/projected/470d01dc-02f0-49f5-912b-087238320dba-kube-api-access-fqhjs\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.316040 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-combined-ca-bundle\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.316085 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-scripts\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.316108 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-credential-keys\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.316179 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-config-data\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.417890 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-fernet-keys\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.417981 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqhjs\" (UniqueName: \"kubernetes.io/projected/470d01dc-02f0-49f5-912b-087238320dba-kube-api-access-fqhjs\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.418003 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-combined-ca-bundle\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.418034 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-scripts\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.418051 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-credential-keys\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.418082 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-config-data\") pod 
\"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.427639 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-combined-ca-bundle\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.428698 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-scripts\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.429932 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-credential-keys\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.444130 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqhjs\" (UniqueName: \"kubernetes.io/projected/470d01dc-02f0-49f5-912b-087238320dba-kube-api-access-fqhjs\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.452682 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-config-data\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc 
kubenswrapper[4898]: I1211 13:26:08.456088 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-fernet-keys\") pod \"keystone-bootstrap-ml4kr\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.556593 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.790421 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="730a9af6-05eb-4ab9-a19e-c2d359813459" path="/var/lib/kubelet/pods/730a9af6-05eb-4ab9-a19e-c2d359813459/volumes" Dec 11 13:26:08 crc kubenswrapper[4898]: I1211 13:26:08.832841 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" podUID="75e494bf-b288-45e2-8f87-7c146a9bb74f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: connect: connection refused" Dec 11 13:26:11 crc kubenswrapper[4898]: E1211 13:26:11.342894 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc\": RecentStats: unable to find data in memory cache]" Dec 11 13:26:13 crc kubenswrapper[4898]: I1211 13:26:13.748522 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" podUID="5c05e490-8014-425f-a1f1-7f4374e1d7da" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.174:5353: i/o timeout" Dec 11 13:26:13 crc kubenswrapper[4898]: I1211 13:26:13.791489 4898 scope.go:117] "RemoveContainer" containerID="6ef43e5e43aca1249e1273c0c990ecce9900dcd616b86611cf8962deeefe1c7a" 
Dec 11 13:26:13 crc kubenswrapper[4898]: I1211 13:26:13.832014 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" podUID="75e494bf-b288-45e2-8f87-7c146a9bb74f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: connect: connection refused" Dec 11 13:26:16 crc kubenswrapper[4898]: E1211 13:26:16.059209 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 11 13:26:16 crc kubenswrapper[4898]: E1211 13:26:16.059999 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r4jdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoo
t:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-gtqwj_openstack(5924443a-434a-4efc-b04d-fdc73d3e2fe6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:26:16 crc kubenswrapper[4898]: E1211 13:26:16.061198 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-gtqwj" podUID="5924443a-434a-4efc-b04d-fdc73d3e2fe6" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.169941 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.293261 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h4k9\" (UniqueName: \"kubernetes.io/projected/5c05e490-8014-425f-a1f1-7f4374e1d7da-kube-api-access-2h4k9\") pod \"5c05e490-8014-425f-a1f1-7f4374e1d7da\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.293697 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-ovsdbserver-sb\") pod \"5c05e490-8014-425f-a1f1-7f4374e1d7da\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.293761 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-dns-swift-storage-0\") pod \"5c05e490-8014-425f-a1f1-7f4374e1d7da\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.293882 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-dns-svc\") pod \"5c05e490-8014-425f-a1f1-7f4374e1d7da\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.293944 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-config\") pod \"5c05e490-8014-425f-a1f1-7f4374e1d7da\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.293998 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-ovsdbserver-nb\") pod \"5c05e490-8014-425f-a1f1-7f4374e1d7da\" (UID: \"5c05e490-8014-425f-a1f1-7f4374e1d7da\") " Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.306046 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c05e490-8014-425f-a1f1-7f4374e1d7da-kube-api-access-2h4k9" (OuterVolumeSpecName: "kube-api-access-2h4k9") pod "5c05e490-8014-425f-a1f1-7f4374e1d7da" (UID: "5c05e490-8014-425f-a1f1-7f4374e1d7da"). InnerVolumeSpecName "kube-api-access-2h4k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.362643 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5c05e490-8014-425f-a1f1-7f4374e1d7da" (UID: "5c05e490-8014-425f-a1f1-7f4374e1d7da"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.384412 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5c05e490-8014-425f-a1f1-7f4374e1d7da" (UID: "5c05e490-8014-425f-a1f1-7f4374e1d7da"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.384431 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5c05e490-8014-425f-a1f1-7f4374e1d7da" (UID: "5c05e490-8014-425f-a1f1-7f4374e1d7da"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.384443 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-config" (OuterVolumeSpecName: "config") pod "5c05e490-8014-425f-a1f1-7f4374e1d7da" (UID: "5c05e490-8014-425f-a1f1-7f4374e1d7da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.384545 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5c05e490-8014-425f-a1f1-7f4374e1d7da" (UID: "5c05e490-8014-425f-a1f1-7f4374e1d7da"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.396399 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.396435 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.396446 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.396473 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 
13:26:16.396485 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c05e490-8014-425f-a1f1-7f4374e1d7da-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.396493 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h4k9\" (UniqueName: \"kubernetes.io/projected/5c05e490-8014-425f-a1f1-7f4374e1d7da-kube-api-access-2h4k9\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:16 crc kubenswrapper[4898]: E1211 13:26:16.529304 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Dec 11 13:26:16 crc kubenswrapper[4898]: E1211 13:26:16.529533 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wwt28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-mvzhc_openstack(23459d62-b558-4f82-a875-311d5fa486e5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 
11 13:26:16 crc kubenswrapper[4898]: E1211 13:26:16.530745 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-mvzhc" podUID="23459d62-b558-4f82-a875-311d5fa486e5" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.868182 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.868175 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" event={"ID":"5c05e490-8014-425f-a1f1-7f4374e1d7da","Type":"ContainerDied","Data":"808880a1addb80674fe1bc5a2ad79ef00371ba9cdcd9c877f2c3782f187da30f"} Dec 11 13:26:16 crc kubenswrapper[4898]: E1211 13:26:16.870737 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-mvzhc" podUID="23459d62-b558-4f82-a875-311d5fa486e5" Dec 11 13:26:16 crc kubenswrapper[4898]: E1211 13:26:16.871113 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-gtqwj" podUID="5924443a-434a-4efc-b04d-fdc73d3e2fe6" Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.929334 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-9flrp"] Dec 11 13:26:16 crc kubenswrapper[4898]: I1211 13:26:16.937715 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-9flrp"] Dec 11 13:26:18 crc kubenswrapper[4898]: 
I1211 13:26:18.750167 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-9flrp" podUID="5c05e490-8014-425f-a1f1-7f4374e1d7da" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.174:5353: i/o timeout" Dec 11 13:26:18 crc kubenswrapper[4898]: I1211 13:26:18.796498 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c05e490-8014-425f-a1f1-7f4374e1d7da" path="/var/lib/kubelet/pods/5c05e490-8014-425f-a1f1-7f4374e1d7da/volumes" Dec 11 13:26:21 crc kubenswrapper[4898]: E1211 13:26:21.463109 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc\": RecentStats: unable to find data in memory cache]" Dec 11 13:26:21 crc kubenswrapper[4898]: E1211 13:26:21.463104 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc\": RecentStats: unable to find data in memory cache]" Dec 11 13:26:23 crc kubenswrapper[4898]: I1211 13:26:23.833117 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" podUID="75e494bf-b288-45e2-8f87-7c146a9bb74f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: i/o timeout" Dec 11 13:26:23 crc kubenswrapper[4898]: I1211 13:26:23.833875 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:26:24 crc kubenswrapper[4898]: I1211 13:26:24.959644 4898 generic.go:334] "Generic (PLEG): container finished" podID="4ff1a3a4-ba9d-4155-b289-46de3809f5f4" 
containerID="30bd400b99cd12f96e97ead55af6bcd9b8d9bf06cfe4939424e74ac281a9f0d5" exitCode=0 Dec 11 13:26:24 crc kubenswrapper[4898]: I1211 13:26:24.959789 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f57gt" event={"ID":"4ff1a3a4-ba9d-4155-b289-46de3809f5f4","Type":"ContainerDied","Data":"30bd400b99cd12f96e97ead55af6bcd9b8d9bf06cfe4939424e74ac281a9f0d5"} Dec 11 13:26:25 crc kubenswrapper[4898]: E1211 13:26:25.974796 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 11 13:26:25 crc kubenswrapper[4898]: E1211 13:26:25.975291 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5d8h5fdh7h55dh94h68ch94h576h589h67ch59ch88hfhbdh5d6h59h5b4h57ch648h697h8ch5f8h9fh6h5f5hffhd6h5b9h5f5h96h98h697q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bund
le.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wf29f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(ce9ab866-6ba7-4782-a002-5f0f4c252b4e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.077028 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.135212 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6kn7\" (UniqueName: \"kubernetes.io/projected/75e494bf-b288-45e2-8f87-7c146a9bb74f-kube-api-access-t6kn7\") pod \"75e494bf-b288-45e2-8f87-7c146a9bb74f\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.135355 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-ovsdbserver-sb\") pod \"75e494bf-b288-45e2-8f87-7c146a9bb74f\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.135602 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-dns-svc\") pod \"75e494bf-b288-45e2-8f87-7c146a9bb74f\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.135647 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-ovsdbserver-nb\") pod \"75e494bf-b288-45e2-8f87-7c146a9bb74f\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.136281 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-config\") pod \"75e494bf-b288-45e2-8f87-7c146a9bb74f\" (UID: \"75e494bf-b288-45e2-8f87-7c146a9bb74f\") " Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.145119 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/75e494bf-b288-45e2-8f87-7c146a9bb74f-kube-api-access-t6kn7" (OuterVolumeSpecName: "kube-api-access-t6kn7") pod "75e494bf-b288-45e2-8f87-7c146a9bb74f" (UID: "75e494bf-b288-45e2-8f87-7c146a9bb74f"). InnerVolumeSpecName "kube-api-access-t6kn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.203178 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75e494bf-b288-45e2-8f87-7c146a9bb74f" (UID: "75e494bf-b288-45e2-8f87-7c146a9bb74f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.211684 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-config" (OuterVolumeSpecName: "config") pod "75e494bf-b288-45e2-8f87-7c146a9bb74f" (UID: "75e494bf-b288-45e2-8f87-7c146a9bb74f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.216133 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75e494bf-b288-45e2-8f87-7c146a9bb74f" (UID: "75e494bf-b288-45e2-8f87-7c146a9bb74f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.226773 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75e494bf-b288-45e2-8f87-7c146a9bb74f" (UID: "75e494bf-b288-45e2-8f87-7c146a9bb74f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.238860 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.238892 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.239097 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.239121 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6kn7\" (UniqueName: \"kubernetes.io/projected/75e494bf-b288-45e2-8f87-7c146a9bb74f-kube-api-access-t6kn7\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.239132 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75e494bf-b288-45e2-8f87-7c146a9bb74f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.985846 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" event={"ID":"75e494bf-b288-45e2-8f87-7c146a9bb74f","Type":"ContainerDied","Data":"0bacbdb44b71cd986e9cbadda0cd0afa8def878e1ff2ae92066c98613391ab46"} Dec 11 13:26:26 crc kubenswrapper[4898]: I1211 13:26:26.985923 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" Dec 11 13:26:27 crc kubenswrapper[4898]: I1211 13:26:27.021177 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-kljnd"] Dec 11 13:26:27 crc kubenswrapper[4898]: I1211 13:26:27.031477 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-kljnd"] Dec 11 13:26:28 crc kubenswrapper[4898]: I1211 13:26:28.793671 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e494bf-b288-45e2-8f87-7c146a9bb74f" path="/var/lib/kubelet/pods/75e494bf-b288-45e2-8f87-7c146a9bb74f/volumes" Dec 11 13:26:28 crc kubenswrapper[4898]: I1211 13:26:28.834562 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-kljnd" podUID="75e494bf-b288-45e2-8f87-7c146a9bb74f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: i/o timeout" Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.090076 4898 scope.go:117] "RemoveContainer" containerID="9ab1a5584ad1e5a2191fc14c17d9a21a0da2fa89920b471a8ba52fedc981117e" Dec 11 13:26:29 crc kubenswrapper[4898]: E1211 13:26:29.140855 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 11 13:26:29 crc kubenswrapper[4898]: E1211 13:26:29.141022 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pzlfq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-6v7s7_openstack(76d89e82-8f2e-4198-8736-28293404a0bd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 11 13:26:29 crc kubenswrapper[4898]: E1211 13:26:29.142382 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-6v7s7" podUID="76d89e82-8f2e-4198-8736-28293404a0bd"
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.240958 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-f57gt"
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.276964 4898 scope.go:117] "RemoveContainer" containerID="a1517f1676b5404bec7c06d64374a259366eb52d1ac868e2c4be19f4bf56cfd4"
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.315529 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7ssj\" (UniqueName: \"kubernetes.io/projected/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-kube-api-access-c7ssj\") pod \"4ff1a3a4-ba9d-4155-b289-46de3809f5f4\" (UID: \"4ff1a3a4-ba9d-4155-b289-46de3809f5f4\") "
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.315724 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-config\") pod \"4ff1a3a4-ba9d-4155-b289-46de3809f5f4\" (UID: \"4ff1a3a4-ba9d-4155-b289-46de3809f5f4\") "
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.315796 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-combined-ca-bundle\") pod \"4ff1a3a4-ba9d-4155-b289-46de3809f5f4\" (UID: \"4ff1a3a4-ba9d-4155-b289-46de3809f5f4\") "
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.325919 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-kube-api-access-c7ssj" (OuterVolumeSpecName: "kube-api-access-c7ssj") pod "4ff1a3a4-ba9d-4155-b289-46de3809f5f4" (UID: "4ff1a3a4-ba9d-4155-b289-46de3809f5f4"). InnerVolumeSpecName "kube-api-access-c7ssj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.366064 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ff1a3a4-ba9d-4155-b289-46de3809f5f4" (UID: "4ff1a3a4-ba9d-4155-b289-46de3809f5f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.368118 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-config" (OuterVolumeSpecName: "config") pod "4ff1a3a4-ba9d-4155-b289-46de3809f5f4" (UID: "4ff1a3a4-ba9d-4155-b289-46de3809f5f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.425546 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-config\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.425581 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.425592 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7ssj\" (UniqueName: \"kubernetes.io/projected/4ff1a3a4-ba9d-4155-b289-46de3809f5f4-kube-api-access-c7ssj\") on node \"crc\" DevicePath \"\""
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.475105 4898 scope.go:117] "RemoveContainer" containerID="bdc535a5c94d2ee02d91b1838893c317b59cf33b60b12338f985b27fa6368719"
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.500221 4898 scope.go:117] "RemoveContainer" containerID="41500bcf18203778da4094b610eb74d8a1fd2e65f82dfa4b72b7ec1602c4279c"
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.530142 4898 scope.go:117] "RemoveContainer" containerID="315e559ddb3ac3b90876b8105f8d971ba84c4c052f8ac4100c18d896859254f1"
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.585580 4898 scope.go:117] "RemoveContainer" containerID="d34d334e13830b57ee1b38d6cdb7249be2dc658324bb3afacd49fac3f310027b"
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.720870 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ml4kr"]
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.819968 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 11 13:26:29 crc kubenswrapper[4898]: I1211 13:26:29.972499 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.022381 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mhjjq" event={"ID":"b93b66d1-a195-42b8-912d-5029d9f0e6b3","Type":"ContainerStarted","Data":"b22c69ce81887d572d8d77d2dcea77704ea3414613a2ccc6e4b6e2abc100a5c4"}
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.026773 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-f57gt"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.029092 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f57gt" event={"ID":"4ff1a3a4-ba9d-4155-b289-46de3809f5f4","Type":"ContainerDied","Data":"1e4c2ee7dae75df9aa983ba3b8b3c33cac263a6a7c91858357354a129bd4e8d1"}
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.029138 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e4c2ee7dae75df9aa983ba3b8b3c33cac263a6a7c91858357354a129bd4e8d1"
Dec 11 13:26:30 crc kubenswrapper[4898]: E1211 13:26:30.032386 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-6v7s7" podUID="76d89e82-8f2e-4198-8736-28293404a0bd"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.050790 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-mhjjq" podStartSLOduration=7.729489695 podStartE2EDuration="35.050768297s" podCreationTimestamp="2025-12-11 13:25:55 +0000 UTC" firstStartedPulling="2025-12-11 13:25:57.688049975 +0000 UTC m=+1315.260376412" lastFinishedPulling="2025-12-11 13:26:25.009328577 +0000 UTC m=+1342.581655014" observedRunningTime="2025-12-11 13:26:30.04292987 +0000 UTC m=+1347.615256307" watchObservedRunningTime="2025-12-11 13:26:30.050768297 +0000 UTC m=+1347.623094754"
Dec 11 13:26:30 crc kubenswrapper[4898]: W1211 13:26:30.115180 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c0762b4_f3a4_4243_8df8_94e805983b4b.slice/crio-7c133537feb4776bbe88c6d81babfb82052c9ecd8a33e41fd80f7c195ecf417c WatchSource:0}: Error finding container 7c133537feb4776bbe88c6d81babfb82052c9ecd8a33e41fd80f7c195ecf417c: Status 404 returned error can't find the container with id 7c133537feb4776bbe88c6d81babfb82052c9ecd8a33e41fd80f7c195ecf417c
Dec 11 13:26:30 crc kubenswrapper[4898]: W1211 13:26:30.117060 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod470d01dc_02f0_49f5_912b_087238320dba.slice/crio-c9212b49de106c8e4378c1f2482265d82d1feb9067881db90469fa60ae15330e WatchSource:0}: Error finding container c9212b49de106c8e4378c1f2482265d82d1feb9067881db90469fa60ae15330e: Status 404 returned error can't find the container with id c9212b49de106c8e4378c1f2482265d82d1feb9067881db90469fa60ae15330e
Dec 11 13:26:30 crc kubenswrapper[4898]: W1211 13:26:30.120718 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dc1c258_0a10_41a3_a831_0b8b1878ac80.slice/crio-a0d23ae69b81b124946d9c11e911d16785ba6ad8e44ade7b74a1d3aebbdc4bdf WatchSource:0}: Error finding container a0d23ae69b81b124946d9c11e911d16785ba6ad8e44ade7b74a1d3aebbdc4bdf: Status 404 returned error can't find the container with id a0d23ae69b81b124946d9c11e911d16785ba6ad8e44ade7b74a1d3aebbdc4bdf
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.416180 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-vhw9q"]
Dec 11 13:26:30 crc kubenswrapper[4898]: E1211 13:26:30.416626 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c05e490-8014-425f-a1f1-7f4374e1d7da" containerName="init"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.416643 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c05e490-8014-425f-a1f1-7f4374e1d7da" containerName="init"
Dec 11 13:26:30 crc kubenswrapper[4898]: E1211 13:26:30.416665 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e494bf-b288-45e2-8f87-7c146a9bb74f" containerName="init"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.416671 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e494bf-b288-45e2-8f87-7c146a9bb74f" containerName="init"
Dec 11 13:26:30 crc kubenswrapper[4898]: E1211 13:26:30.416681 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c05e490-8014-425f-a1f1-7f4374e1d7da" containerName="dnsmasq-dns"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.416703 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c05e490-8014-425f-a1f1-7f4374e1d7da" containerName="dnsmasq-dns"
Dec 11 13:26:30 crc kubenswrapper[4898]: E1211 13:26:30.416712 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff1a3a4-ba9d-4155-b289-46de3809f5f4" containerName="neutron-db-sync"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.416717 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff1a3a4-ba9d-4155-b289-46de3809f5f4" containerName="neutron-db-sync"
Dec 11 13:26:30 crc kubenswrapper[4898]: E1211 13:26:30.416738 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e494bf-b288-45e2-8f87-7c146a9bb74f" containerName="dnsmasq-dns"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.416744 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e494bf-b288-45e2-8f87-7c146a9bb74f" containerName="dnsmasq-dns"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.416934 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e494bf-b288-45e2-8f87-7c146a9bb74f" containerName="dnsmasq-dns"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.416947 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c05e490-8014-425f-a1f1-7f4374e1d7da" containerName="dnsmasq-dns"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.416969 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ff1a3a4-ba9d-4155-b289-46de3809f5f4" containerName="neutron-db-sync"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.418017 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.433034 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-vhw9q"]
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.459401 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-config\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.459783 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.459888 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.459934 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-dns-svc\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.460903 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.462514 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58wpg\" (UniqueName: \"kubernetes.io/projected/ec7227b2-711c-479a-91b3-4f4d2f77ace4-kube-api-access-58wpg\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.535522 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d94654b9b-gfmwl"]
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.538681 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.547832 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.547845 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.548017 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.550069 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-dn6dl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.569983 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.570031 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-dns-svc\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.570146 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7bjd\" (UniqueName: \"kubernetes.io/projected/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-kube-api-access-g7bjd\") pod \"neutron-d94654b9b-gfmwl\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.570172 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.570247 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58wpg\" (UniqueName: \"kubernetes.io/projected/ec7227b2-711c-479a-91b3-4f4d2f77ace4-kube-api-access-58wpg\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.570316 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-config\") pod \"neutron-d94654b9b-gfmwl\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.570341 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-config\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.570403 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.570448 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-httpd-config\") pod \"neutron-d94654b9b-gfmwl\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.570482 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-combined-ca-bundle\") pod \"neutron-d94654b9b-gfmwl\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.570497 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-ovndb-tls-certs\") pod \"neutron-d94654b9b-gfmwl\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.571538 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.572112 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-dns-svc\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.572685 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-config\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.572977 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.573210 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.580619 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d94654b9b-gfmwl"]
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.605601 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58wpg\" (UniqueName: \"kubernetes.io/projected/ec7227b2-711c-479a-91b3-4f4d2f77ace4-kube-api-access-58wpg\") pod \"dnsmasq-dns-6b7b667979-vhw9q\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.672871 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-config\") pod \"neutron-d94654b9b-gfmwl\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.672952 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-httpd-config\") pod \"neutron-d94654b9b-gfmwl\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.672980 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-combined-ca-bundle\") pod \"neutron-d94654b9b-gfmwl\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.672996 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-ovndb-tls-certs\") pod \"neutron-d94654b9b-gfmwl\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.673087 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7bjd\" (UniqueName: \"kubernetes.io/projected/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-kube-api-access-g7bjd\") pod \"neutron-d94654b9b-gfmwl\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.681702 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-combined-ca-bundle\") pod \"neutron-d94654b9b-gfmwl\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.682777 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-config\") pod \"neutron-d94654b9b-gfmwl\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.690851 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-httpd-config\") pod \"neutron-d94654b9b-gfmwl\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.694541 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7bjd\" (UniqueName: \"kubernetes.io/projected/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-kube-api-access-g7bjd\") pod \"neutron-d94654b9b-gfmwl\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.706116 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-ovndb-tls-certs\") pod \"neutron-d94654b9b-gfmwl\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.764890 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-vhw9q"
Dec 11 13:26:30 crc kubenswrapper[4898]: I1211 13:26:30.870970 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:31 crc kubenswrapper[4898]: I1211 13:26:31.079972 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce9ab866-6ba7-4782-a002-5f0f4c252b4e","Type":"ContainerStarted","Data":"b04914645067dec13ff09d02784be8c6ab8970a8b466dfc229a4638004568231"}
Dec 11 13:26:31 crc kubenswrapper[4898]: I1211 13:26:31.092814 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1dc1c258-0a10-41a3-a831-0b8b1878ac80","Type":"ContainerStarted","Data":"a0d23ae69b81b124946d9c11e911d16785ba6ad8e44ade7b74a1d3aebbdc4bdf"}
Dec 11 13:26:31 crc kubenswrapper[4898]: I1211 13:26:31.134559 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-mvzhc" event={"ID":"23459d62-b558-4f82-a875-311d5fa486e5","Type":"ContainerStarted","Data":"a8fb01bc589dee8d9d5d9e46068c3f94228cd2cd6920721105182af1d784b83e"}
Dec 11 13:26:31 crc kubenswrapper[4898]: I1211 13:26:31.141089 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gtqwj" event={"ID":"5924443a-434a-4efc-b04d-fdc73d3e2fe6","Type":"ContainerStarted","Data":"5eb8b1fac0eefc98cdd4aa11ee9069c852eb895b98d696667be2c09f7aa07bbb"}
Dec 11 13:26:31 crc kubenswrapper[4898]: I1211 13:26:31.148664 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ml4kr" event={"ID":"470d01dc-02f0-49f5-912b-087238320dba","Type":"ContainerStarted","Data":"70ef5f3eb8612084e2aa274d7d73c0d22001ad25ea423aa131b75b0ee1ce27dc"}
Dec 11 13:26:31 crc kubenswrapper[4898]: I1211 13:26:31.148706 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ml4kr" event={"ID":"470d01dc-02f0-49f5-912b-087238320dba","Type":"ContainerStarted","Data":"c9212b49de106c8e4378c1f2482265d82d1feb9067881db90469fa60ae15330e"}
Dec 11 13:26:31 crc kubenswrapper[4898]: I1211 13:26:31.153794 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c0762b4-f3a4-4243-8df8-94e805983b4b","Type":"ContainerStarted","Data":"7c133537feb4776bbe88c6d81babfb82052c9ecd8a33e41fd80f7c195ecf417c"}
Dec 11 13:26:31 crc kubenswrapper[4898]: I1211 13:26:31.160870 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-mvzhc" podStartSLOduration=3.087849334 podStartE2EDuration="36.16084689s" podCreationTimestamp="2025-12-11 13:25:55 +0000 UTC" firstStartedPulling="2025-12-11 13:25:57.224082674 +0000 UTC m=+1314.796409111" lastFinishedPulling="2025-12-11 13:26:30.29708023 +0000 UTC m=+1347.869406667" observedRunningTime="2025-12-11 13:26:31.159696239 +0000 UTC m=+1348.732022676" watchObservedRunningTime="2025-12-11 13:26:31.16084689 +0000 UTC m=+1348.733173327"
Dec 11 13:26:31 crc kubenswrapper[4898]: I1211 13:26:31.196895 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-gtqwj" podStartSLOduration=3.356203388 podStartE2EDuration="36.196876829s" podCreationTimestamp="2025-12-11 13:25:55 +0000 UTC" firstStartedPulling="2025-12-11 13:25:57.605976122 +0000 UTC m=+1315.178302549" lastFinishedPulling="2025-12-11 13:26:30.446649563 +0000 UTC m=+1348.018975990" observedRunningTime="2025-12-11 13:26:31.181824253 +0000 UTC m=+1348.754150690" watchObservedRunningTime="2025-12-11 13:26:31.196876829 +0000 UTC m=+1348.769203266"
Dec 11 13:26:31 crc kubenswrapper[4898]: I1211 13:26:31.251808 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ml4kr" podStartSLOduration=23.251787907 podStartE2EDuration="23.251787907s" podCreationTimestamp="2025-12-11 13:26:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:26:31.213637021 +0000 UTC m=+1348.785963458" watchObservedRunningTime="2025-12-11 13:26:31.251787907 +0000 UTC m=+1348.824114344"
Dec 11 13:26:31 crc kubenswrapper[4898]: I1211 13:26:31.460583 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-vhw9q"]
Dec 11 13:26:31 crc kubenswrapper[4898]: I1211 13:26:31.788642 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d94654b9b-gfmwl"]
Dec 11 13:26:32 crc kubenswrapper[4898]: E1211 13:26:32.125726 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc\": RecentStats: unable to find data in memory cache]"
Dec 11 13:26:32 crc kubenswrapper[4898]: I1211 13:26:32.228953 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1dc1c258-0a10-41a3-a831-0b8b1878ac80","Type":"ContainerStarted","Data":"f443b849f68ba83870d8cd9bb8cb6e0ee28dc53333c98ea0253e563f39842ea5"}
Dec 11 13:26:32 crc kubenswrapper[4898]: I1211 13:26:32.232705 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d94654b9b-gfmwl" event={"ID":"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030","Type":"ContainerStarted","Data":"97372bde30f05b0912633875803c2a93a455f48b80e4958d1f47121364becccf"}
Dec 11 13:26:32 crc kubenswrapper[4898]: I1211 13:26:32.244108 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-vhw9q" event={"ID":"ec7227b2-711c-479a-91b3-4f4d2f77ace4","Type":"ContainerStarted","Data":"12e8ff482a378e1dc9dfdbba6c6ae7109eb79093bedcc5bf2ca056fae419ed70"}
Dec 11 13:26:32 crc kubenswrapper[4898]: I1211 13:26:32.267239 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c0762b4-f3a4-4243-8df8-94e805983b4b","Type":"ContainerStarted","Data":"3d10166a76dcef4f62b5c7bc9c12ea67476d6be9d8622c008144c0f8e7c00080"}
Dec 11 13:26:33 crc kubenswrapper[4898]: I1211 13:26:33.300895 4898 generic.go:334] "Generic (PLEG): container finished" podID="ec7227b2-711c-479a-91b3-4f4d2f77ace4" containerID="13612d4efdb7fa2015fc0f48a89b3fec89946e6e4b65080075b9d834d5dbb2e4" exitCode=0
Dec 11 13:26:33 crc kubenswrapper[4898]: I1211 13:26:33.301602 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-vhw9q" event={"ID":"ec7227b2-711c-479a-91b3-4f4d2f77ace4","Type":"ContainerDied","Data":"13612d4efdb7fa2015fc0f48a89b3fec89946e6e4b65080075b9d834d5dbb2e4"}
Dec 11 13:26:33 crc kubenswrapper[4898]: I1211 13:26:33.306727 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c0762b4-f3a4-4243-8df8-94e805983b4b","Type":"ContainerStarted","Data":"2c7613a2c4c0a54040bfa09053d6ae624afab17f5a34bdbb4122d42d97ca4e33"}
Dec 11 13:26:33 crc kubenswrapper[4898]: I1211 13:26:33.335464 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1dc1c258-0a10-41a3-a831-0b8b1878ac80","Type":"ContainerStarted","Data":"8238f160349bb9bfa3897baa7615ac6ec775c9f9dafb1f1c69ff40c49db69db6"}
Dec 11 13:26:33 crc kubenswrapper[4898]: I1211 13:26:33.358649 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=29.358609017 podStartE2EDuration="29.358609017s" podCreationTimestamp="2025-12-11 13:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:26:33.351443978 +0000 UTC m=+1350.923770415" watchObservedRunningTime="2025-12-11 13:26:33.358609017 +0000 UTC m=+1350.930935454"
Dec 11 13:26:33 crc kubenswrapper[4898]: I1211 13:26:33.366092 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d94654b9b-gfmwl" event={"ID":"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030","Type":"ContainerStarted","Data":"c71ccb054f5ca50913600baeabfc1b605ea7ad4e97a622031413c8e61b1dbc90"}
Dec 11 13:26:33 crc kubenswrapper[4898]: I1211 13:26:33.366327 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d94654b9b-gfmwl" event={"ID":"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030","Type":"ContainerStarted","Data":"9ef00d43822c19520cf754da05080d4a738171a7c2e4dd6946a0bfd214b8e810"}
Dec 11 13:26:33 crc kubenswrapper[4898]: I1211 13:26:33.366403 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-d94654b9b-gfmwl"
Dec 11 13:26:33 crc kubenswrapper[4898]: I1211 13:26:33.373238 4898 generic.go:334] "Generic (PLEG): container finished" podID="b93b66d1-a195-42b8-912d-5029d9f0e6b3" containerID="b22c69ce81887d572d8d77d2dcea77704ea3414613a2ccc6e4b6e2abc100a5c4" exitCode=0
Dec 11 13:26:33 crc kubenswrapper[4898]: I1211 13:26:33.373389 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mhjjq" event={"ID":"b93b66d1-a195-42b8-912d-5029d9f0e6b3","Type":"ContainerDied","Data":"b22c69ce81887d572d8d77d2dcea77704ea3414613a2ccc6e4b6e2abc100a5c4"}
Dec 11 13:26:33 crc kubenswrapper[4898]: I1211 13:26:33.395345 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=29.395321334 podStartE2EDuration="29.395321334s" podCreationTimestamp="2025-12-11 13:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:26:33.380416781 +0000 UTC m=+1350.952743218" watchObservedRunningTime="2025-12-11 13:26:33.395321334 +0000 UTC m=+1350.967647771"
Dec 11 13:26:33 crc kubenswrapper[4898]: I1211 13:26:33.442049 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openstack/neutron-d94654b9b-gfmwl" podStartSLOduration=3.442027746 podStartE2EDuration="3.442027746s" podCreationTimestamp="2025-12-11 13:26:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:26:33.412658891 +0000 UTC m=+1350.984985328" watchObservedRunningTime="2025-12-11 13:26:33.442027746 +0000 UTC m=+1351.014354173" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.089168 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-84b4b98fdc-tbjdg"] Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.095278 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.098554 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.099404 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.123750 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84b4b98fdc-tbjdg"] Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.254123 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-config\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.254202 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-ovndb-tls-certs\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: 
\"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.254317 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-internal-tls-certs\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.254359 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-combined-ca-bundle\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.254383 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xzrj\" (UniqueName: \"kubernetes.io/projected/72ca8f16-912b-44f0-bc9d-868f381fb8fb-kube-api-access-5xzrj\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.254423 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-public-tls-certs\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.254474 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-httpd-config\") pod 
\"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.356241 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-internal-tls-certs\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.356522 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-combined-ca-bundle\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.356549 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xzrj\" (UniqueName: \"kubernetes.io/projected/72ca8f16-912b-44f0-bc9d-868f381fb8fb-kube-api-access-5xzrj\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.356587 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-public-tls-certs\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.356628 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-httpd-config\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") 
" pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.356645 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-config\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.356682 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-ovndb-tls-certs\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.370120 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-internal-tls-certs\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.377110 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-ovndb-tls-certs\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.383880 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-combined-ca-bundle\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.413080 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-httpd-config\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.414119 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xzrj\" (UniqueName: \"kubernetes.io/projected/72ca8f16-912b-44f0-bc9d-868f381fb8fb-kube-api-access-5xzrj\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.422181 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-public-tls-certs\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.429177 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/72ca8f16-912b-44f0-bc9d-868f381fb8fb-config\") pod \"neutron-84b4b98fdc-tbjdg\" (UID: \"72ca8f16-912b-44f0-bc9d-868f381fb8fb\") " pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:34 crc kubenswrapper[4898]: I1211 13:26:34.447958 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:35 crc kubenswrapper[4898]: I1211 13:26:35.469556 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 11 13:26:35 crc kubenswrapper[4898]: I1211 13:26:35.469811 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 11 13:26:35 crc kubenswrapper[4898]: I1211 13:26:35.469821 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 11 13:26:35 crc kubenswrapper[4898]: I1211 13:26:35.469832 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 11 13:26:35 crc kubenswrapper[4898]: I1211 13:26:35.492135 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 11 13:26:35 crc kubenswrapper[4898]: I1211 13:26:35.492190 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 11 13:26:35 crc kubenswrapper[4898]: I1211 13:26:35.492201 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 11 13:26:35 crc kubenswrapper[4898]: I1211 13:26:35.492211 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 11 13:26:35 crc kubenswrapper[4898]: I1211 13:26:35.724984 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 11 13:26:35 crc kubenswrapper[4898]: I1211 13:26:35.725053 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 11 13:26:35 crc kubenswrapper[4898]: I1211 13:26:35.725759 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 11 13:26:35 crc kubenswrapper[4898]: I1211 13:26:35.727448 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 11 13:26:35 crc kubenswrapper[4898]: I1211 13:26:35.872703 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84b4b98fdc-tbjdg"] Dec 11 13:26:35 crc kubenswrapper[4898]: W1211 13:26:35.888783 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72ca8f16_912b_44f0_bc9d_868f381fb8fb.slice/crio-2de666bd2fb623278a2962bd5ce987a9ae9c83b02b9204109c2ff510a069cb18 WatchSource:0}: Error finding container 2de666bd2fb623278a2962bd5ce987a9ae9c83b02b9204109c2ff510a069cb18: Status 404 returned error can't find the container with id 2de666bd2fb623278a2962bd5ce987a9ae9c83b02b9204109c2ff510a069cb18 Dec 11 13:26:36 crc kubenswrapper[4898]: E1211 13:26:36.245219 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc\": RecentStats: unable to find data in memory cache]" Dec 11 13:26:36 crc kubenswrapper[4898]: I1211 13:26:36.440984 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84b4b98fdc-tbjdg" event={"ID":"72ca8f16-912b-44f0-bc9d-868f381fb8fb","Type":"ContainerStarted","Data":"2de666bd2fb623278a2962bd5ce987a9ae9c83b02b9204109c2ff510a069cb18"} Dec 11 13:26:36 crc kubenswrapper[4898]: I1211 13:26:36.446892 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-vhw9q" event={"ID":"ec7227b2-711c-479a-91b3-4f4d2f77ace4","Type":"ContainerStarted","Data":"b4360fab85d966d93e1f09c72b64e4f81927aff75dca30e4b837dcdb84663211"} 
Dec 11 13:26:36 crc kubenswrapper[4898]: I1211 13:26:36.448149 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-vhw9q" Dec 11 13:26:36 crc kubenswrapper[4898]: I1211 13:26:36.473056 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-vhw9q" podStartSLOduration=6.473034447 podStartE2EDuration="6.473034447s" podCreationTimestamp="2025-12-11 13:26:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:26:36.472675788 +0000 UTC m=+1354.045002235" watchObservedRunningTime="2025-12-11 13:26:36.473034447 +0000 UTC m=+1354.045360884" Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.471228 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mhjjq" Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.474882 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mhjjq" event={"ID":"b93b66d1-a195-42b8-912d-5029d9f0e6b3","Type":"ContainerDied","Data":"14e779bfe73e233dab3b23e31aa82f28daec4112cfc07c5be1b0b99208ecc942"} Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.475022 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14e779bfe73e233dab3b23e31aa82f28daec4112cfc07c5be1b0b99208ecc942" Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.634812 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-config-data\") pod \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.634907 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw5pd\" (UniqueName: 
\"kubernetes.io/projected/b93b66d1-a195-42b8-912d-5029d9f0e6b3-kube-api-access-qw5pd\") pod \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.634951 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b93b66d1-a195-42b8-912d-5029d9f0e6b3-logs\") pod \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.635041 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-combined-ca-bundle\") pod \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.635112 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-scripts\") pod \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\" (UID: \"b93b66d1-a195-42b8-912d-5029d9f0e6b3\") " Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.635711 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b93b66d1-a195-42b8-912d-5029d9f0e6b3-logs" (OuterVolumeSpecName: "logs") pod "b93b66d1-a195-42b8-912d-5029d9f0e6b3" (UID: "b93b66d1-a195-42b8-912d-5029d9f0e6b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.642185 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b93b66d1-a195-42b8-912d-5029d9f0e6b3-kube-api-access-qw5pd" (OuterVolumeSpecName: "kube-api-access-qw5pd") pod "b93b66d1-a195-42b8-912d-5029d9f0e6b3" (UID: "b93b66d1-a195-42b8-912d-5029d9f0e6b3"). 
InnerVolumeSpecName "kube-api-access-qw5pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.655633 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-scripts" (OuterVolumeSpecName: "scripts") pod "b93b66d1-a195-42b8-912d-5029d9f0e6b3" (UID: "b93b66d1-a195-42b8-912d-5029d9f0e6b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.671891 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-config-data" (OuterVolumeSpecName: "config-data") pod "b93b66d1-a195-42b8-912d-5029d9f0e6b3" (UID: "b93b66d1-a195-42b8-912d-5029d9f0e6b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.701869 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b93b66d1-a195-42b8-912d-5029d9f0e6b3" (UID: "b93b66d1-a195-42b8-912d-5029d9f0e6b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.738373 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.738415 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw5pd\" (UniqueName: \"kubernetes.io/projected/b93b66d1-a195-42b8-912d-5029d9f0e6b3-kube-api-access-qw5pd\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.738429 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b93b66d1-a195-42b8-912d-5029d9f0e6b3-logs\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.738437 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.738445 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b93b66d1-a195-42b8-912d-5029d9f0e6b3-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.980082 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 11 13:26:39 crc kubenswrapper[4898]: I1211 13:26:39.991865 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.072936 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.488797 4898 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-84b4b98fdc-tbjdg" event={"ID":"72ca8f16-912b-44f0-bc9d-868f381fb8fb","Type":"ContainerStarted","Data":"de4aa06e9f4a066d72338426f7eaf5f08340bb12be905095a972f20d0e06c62d"} Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.488813 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mhjjq" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.582494 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6849f86cdd-lt69n"] Dec 11 13:26:40 crc kubenswrapper[4898]: E1211 13:26:40.582917 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b93b66d1-a195-42b8-912d-5029d9f0e6b3" containerName="placement-db-sync" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.582935 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b93b66d1-a195-42b8-912d-5029d9f0e6b3" containerName="placement-db-sync" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.583149 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b93b66d1-a195-42b8-912d-5029d9f0e6b3" containerName="placement-db-sync" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.593586 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.596535 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.597182 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hfvt8" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.597777 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.598158 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.598350 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.611909 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6849f86cdd-lt69n"] Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.760126 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-config-data\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.760224 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-scripts\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.760275 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-internal-tls-certs\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.760449 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-logs\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.760537 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-combined-ca-bundle\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.760618 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-public-tls-certs\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.760750 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfwpx\" (UniqueName: \"kubernetes.io/projected/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-kube-api-access-lfwpx\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.766624 4898 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-vhw9q" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.876101 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-config-data\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.876219 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-scripts\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.876328 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-internal-tls-certs\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.876602 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-logs\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.876632 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-combined-ca-bundle\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " 
pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.876688 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-public-tls-certs\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.876837 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfwpx\" (UniqueName: \"kubernetes.io/projected/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-kube-api-access-lfwpx\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.879879 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-logs\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.888701 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-combined-ca-bundle\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.899957 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-internal-tls-certs\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 
13:26:40.900061 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-config-data\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.902339 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-public-tls-certs\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.908437 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-scripts\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.912202 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfwpx\" (UniqueName: \"kubernetes.io/projected/0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1-kube-api-access-lfwpx\") pod \"placement-6849f86cdd-lt69n\" (UID: \"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1\") " pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.922988 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-d89j6"] Dec 11 13:26:40 crc kubenswrapper[4898]: I1211 13:26:40.925860 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" podUID="298d4481-1b69-43a5-89a8-218a127c0a51" containerName="dnsmasq-dns" containerID="cri-o://b0f3105ea6f8d99d069fecdbc2b1b5adb7a42fc436acac65aa7207065eb67b9d" gracePeriod=10 Dec 11 13:26:40 crc 
kubenswrapper[4898]: I1211 13:26:40.972217 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:41 crc kubenswrapper[4898]: I1211 13:26:41.504054 4898 generic.go:334] "Generic (PLEG): container finished" podID="470d01dc-02f0-49f5-912b-087238320dba" containerID="70ef5f3eb8612084e2aa274d7d73c0d22001ad25ea423aa131b75b0ee1ce27dc" exitCode=0 Dec 11 13:26:41 crc kubenswrapper[4898]: I1211 13:26:41.504145 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ml4kr" event={"ID":"470d01dc-02f0-49f5-912b-087238320dba","Type":"ContainerDied","Data":"70ef5f3eb8612084e2aa274d7d73c0d22001ad25ea423aa131b75b0ee1ce27dc"} Dec 11 13:26:41 crc kubenswrapper[4898]: I1211 13:26:41.509897 4898 generic.go:334] "Generic (PLEG): container finished" podID="298d4481-1b69-43a5-89a8-218a127c0a51" containerID="b0f3105ea6f8d99d069fecdbc2b1b5adb7a42fc436acac65aa7207065eb67b9d" exitCode=0 Dec 11 13:26:41 crc kubenswrapper[4898]: I1211 13:26:41.509937 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" event={"ID":"298d4481-1b69-43a5-89a8-218a127c0a51","Type":"ContainerDied","Data":"b0f3105ea6f8d99d069fecdbc2b1b5adb7a42fc436acac65aa7207065eb67b9d"} Dec 11 13:26:41 crc kubenswrapper[4898]: I1211 13:26:41.791356 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" podUID="298d4481-1b69-43a5-89a8-218a127c0a51" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.183:5353: connect: connection refused" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.404293 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.492164 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.554899 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.558167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-d89j6" event={"ID":"298d4481-1b69-43a5-89a8-218a127c0a51","Type":"ContainerDied","Data":"d21566ca382cbfa9930d87bec625a05a530a0c1e429b28dabcbad20d945a0b27"} Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.558293 4898 scope.go:117] "RemoveContainer" containerID="b0f3105ea6f8d99d069fecdbc2b1b5adb7a42fc436acac65aa7207065eb67b9d" Dec 11 13:26:42 crc kubenswrapper[4898]: E1211 13:26:42.626266 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc\": RecentStats: unable to find data in memory cache]" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.646630 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-config\") pod \"298d4481-1b69-43a5-89a8-218a127c0a51\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.646982 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbrbr\" (UniqueName: \"kubernetes.io/projected/298d4481-1b69-43a5-89a8-218a127c0a51-kube-api-access-gbrbr\") pod \"298d4481-1b69-43a5-89a8-218a127c0a51\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.647161 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-dns-svc\") pod \"298d4481-1b69-43a5-89a8-218a127c0a51\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.647223 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-ovsdbserver-nb\") pod \"298d4481-1b69-43a5-89a8-218a127c0a51\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.647244 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-ovsdbserver-sb\") pod \"298d4481-1b69-43a5-89a8-218a127c0a51\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.647267 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-dns-swift-storage-0\") pod \"298d4481-1b69-43a5-89a8-218a127c0a51\" (UID: \"298d4481-1b69-43a5-89a8-218a127c0a51\") " Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.651741 4898 scope.go:117] "RemoveContainer" containerID="d7f6d8d4d2c0a3a2f0fc3a8e92e503d5808996cbfed6e3d92b807573d84addc7" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.660418 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/298d4481-1b69-43a5-89a8-218a127c0a51-kube-api-access-gbrbr" (OuterVolumeSpecName: "kube-api-access-gbrbr") pod "298d4481-1b69-43a5-89a8-218a127c0a51" (UID: "298d4481-1b69-43a5-89a8-218a127c0a51"). InnerVolumeSpecName "kube-api-access-gbrbr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.714907 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "298d4481-1b69-43a5-89a8-218a127c0a51" (UID: "298d4481-1b69-43a5-89a8-218a127c0a51"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.719415 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6849f86cdd-lt69n"] Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.735970 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "298d4481-1b69-43a5-89a8-218a127c0a51" (UID: "298d4481-1b69-43a5-89a8-218a127c0a51"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.743634 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-config" (OuterVolumeSpecName: "config") pod "298d4481-1b69-43a5-89a8-218a127c0a51" (UID: "298d4481-1b69-43a5-89a8-218a127c0a51"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.750904 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.750950 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.750961 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.750971 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbrbr\" (UniqueName: \"kubernetes.io/projected/298d4481-1b69-43a5-89a8-218a127c0a51-kube-api-access-gbrbr\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.759438 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "298d4481-1b69-43a5-89a8-218a127c0a51" (UID: "298d4481-1b69-43a5-89a8-218a127c0a51"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.762174 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "298d4481-1b69-43a5-89a8-218a127c0a51" (UID: "298d4481-1b69-43a5-89a8-218a127c0a51"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:42 crc kubenswrapper[4898]: W1211 13:26:42.832585 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ee89c59_41c3_4df8_b1ed_7a0fab3b6ee1.slice/crio-4a450aa2237645d325c30e33fdee31c1c4e5a82486914e76588442910e1e2cda WatchSource:0}: Error finding container 4a450aa2237645d325c30e33fdee31c1c4e5a82486914e76588442910e1e2cda: Status 404 returned error can't find the container with id 4a450aa2237645d325c30e33fdee31c1c4e5a82486914e76588442910e1e2cda Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.853292 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.853325 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/298d4481-1b69-43a5-89a8-218a127c0a51-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.894434 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-d89j6"] Dec 11 13:26:42 crc kubenswrapper[4898]: I1211 13:26:42.909320 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-d89j6"] Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.107715 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.268650 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqhjs\" (UniqueName: \"kubernetes.io/projected/470d01dc-02f0-49f5-912b-087238320dba-kube-api-access-fqhjs\") pod \"470d01dc-02f0-49f5-912b-087238320dba\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.268763 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-scripts\") pod \"470d01dc-02f0-49f5-912b-087238320dba\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.268905 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-credential-keys\") pod \"470d01dc-02f0-49f5-912b-087238320dba\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.268933 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-fernet-keys\") pod \"470d01dc-02f0-49f5-912b-087238320dba\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.268973 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-config-data\") pod \"470d01dc-02f0-49f5-912b-087238320dba\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.269002 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-combined-ca-bundle\") pod \"470d01dc-02f0-49f5-912b-087238320dba\" (UID: \"470d01dc-02f0-49f5-912b-087238320dba\") " Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.273749 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "470d01dc-02f0-49f5-912b-087238320dba" (UID: "470d01dc-02f0-49f5-912b-087238320dba"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.275104 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "470d01dc-02f0-49f5-912b-087238320dba" (UID: "470d01dc-02f0-49f5-912b-087238320dba"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.282771 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/470d01dc-02f0-49f5-912b-087238320dba-kube-api-access-fqhjs" (OuterVolumeSpecName: "kube-api-access-fqhjs") pod "470d01dc-02f0-49f5-912b-087238320dba" (UID: "470d01dc-02f0-49f5-912b-087238320dba"). InnerVolumeSpecName "kube-api-access-fqhjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.285007 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-scripts" (OuterVolumeSpecName: "scripts") pod "470d01dc-02f0-49f5-912b-087238320dba" (UID: "470d01dc-02f0-49f5-912b-087238320dba"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.308358 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-config-data" (OuterVolumeSpecName: "config-data") pod "470d01dc-02f0-49f5-912b-087238320dba" (UID: "470d01dc-02f0-49f5-912b-087238320dba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.309423 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "470d01dc-02f0-49f5-912b-087238320dba" (UID: "470d01dc-02f0-49f5-912b-087238320dba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.371027 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqhjs\" (UniqueName: \"kubernetes.io/projected/470d01dc-02f0-49f5-912b-087238320dba-kube-api-access-fqhjs\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.371059 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.371069 4898 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.371077 4898 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 11 
13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.371087 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.371096 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/470d01dc-02f0-49f5-912b-087238320dba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.576917 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6v7s7" event={"ID":"76d89e82-8f2e-4198-8736-28293404a0bd","Type":"ContainerStarted","Data":"87016d8e9b9f797522f428ba0d870d8d0bee8cdfd5017a713ac92eb31e20f22f"} Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.581812 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ml4kr" event={"ID":"470d01dc-02f0-49f5-912b-087238320dba","Type":"ContainerDied","Data":"c9212b49de106c8e4378c1f2482265d82d1feb9067881db90469fa60ae15330e"} Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.581852 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9212b49de106c8e4378c1f2482265d82d1feb9067881db90469fa60ae15330e" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.581852 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ml4kr" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.587196 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84b4b98fdc-tbjdg" event={"ID":"72ca8f16-912b-44f0-bc9d-868f381fb8fb","Type":"ContainerStarted","Data":"259a9368647705b6849f534f0dbcb4d3659b3bf64a78cb202f0501bc18012c96"} Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.588152 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.589760 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce9ab866-6ba7-4782-a002-5f0f4c252b4e","Type":"ContainerStarted","Data":"da70ba5ae6f4baab0b2940ef49df332e4852297598182a36995f2e14873b46b6"} Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.594840 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6849f86cdd-lt69n" event={"ID":"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1","Type":"ContainerStarted","Data":"4332e99f305849c80133f7a574454ae856978dbfdb031a10a2d57cc64e102f4b"} Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.594906 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6849f86cdd-lt69n" event={"ID":"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1","Type":"ContainerStarted","Data":"2319055c71eb3232b287e843cb75b745d10e06f382796ba6ab9579b9c6d1773c"} Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.594924 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6849f86cdd-lt69n" event={"ID":"0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1","Type":"ContainerStarted","Data":"4a450aa2237645d325c30e33fdee31c1c4e5a82486914e76588442910e1e2cda"} Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.595022 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:43 crc 
kubenswrapper[4898]: I1211 13:26:43.595042 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.602295 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-6v7s7" podStartSLOduration=3.703051063 podStartE2EDuration="48.602281505s" podCreationTimestamp="2025-12-11 13:25:55 +0000 UTC" firstStartedPulling="2025-12-11 13:25:57.222351749 +0000 UTC m=+1314.794678196" lastFinishedPulling="2025-12-11 13:26:42.121582201 +0000 UTC m=+1359.693908638" observedRunningTime="2025-12-11 13:26:43.592811206 +0000 UTC m=+1361.165137643" watchObservedRunningTime="2025-12-11 13:26:43.602281505 +0000 UTC m=+1361.174607942" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.659142 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5d97d757c9-6txl4"] Dec 11 13:26:43 crc kubenswrapper[4898]: E1211 13:26:43.659622 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470d01dc-02f0-49f5-912b-087238320dba" containerName="keystone-bootstrap" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.659650 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="470d01dc-02f0-49f5-912b-087238320dba" containerName="keystone-bootstrap" Dec 11 13:26:43 crc kubenswrapper[4898]: E1211 13:26:43.659680 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="298d4481-1b69-43a5-89a8-218a127c0a51" containerName="init" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.659686 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="298d4481-1b69-43a5-89a8-218a127c0a51" containerName="init" Dec 11 13:26:43 crc kubenswrapper[4898]: E1211 13:26:43.659696 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="298d4481-1b69-43a5-89a8-218a127c0a51" containerName="dnsmasq-dns" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.659703 4898 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="298d4481-1b69-43a5-89a8-218a127c0a51" containerName="dnsmasq-dns" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.659906 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="298d4481-1b69-43a5-89a8-218a127c0a51" containerName="dnsmasq-dns" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.659935 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="470d01dc-02f0-49f5-912b-087238320dba" containerName="keystone-bootstrap" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.660671 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.662256 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-84b4b98fdc-tbjdg" podStartSLOduration=9.662240716 podStartE2EDuration="9.662240716s" podCreationTimestamp="2025-12-11 13:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:26:43.631321751 +0000 UTC m=+1361.203648188" watchObservedRunningTime="2025-12-11 13:26:43.662240716 +0000 UTC m=+1361.234567143" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.668692 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.668712 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.671882 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.671894 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.672199 4898 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qkphd" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.672302 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.721998 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d97d757c9-6txl4"] Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.749863 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6849f86cdd-lt69n" podStartSLOduration=3.7498438849999998 podStartE2EDuration="3.749843885s" podCreationTimestamp="2025-12-11 13:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:26:43.672032624 +0000 UTC m=+1361.244359061" watchObservedRunningTime="2025-12-11 13:26:43.749843885 +0000 UTC m=+1361.322170322" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.803998 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-internal-tls-certs\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.804252 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-public-tls-certs\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.804291 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-credential-keys\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.804317 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-config-data\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.804387 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-fernet-keys\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.804477 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-combined-ca-bundle\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.804504 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-scripts\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.804570 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcgtr\" (UniqueName: 
\"kubernetes.io/projected/c23801ba-7898-47cc-bbf9-bd108c25f99e-kube-api-access-tcgtr\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.906959 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcgtr\" (UniqueName: \"kubernetes.io/projected/c23801ba-7898-47cc-bbf9-bd108c25f99e-kube-api-access-tcgtr\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.907038 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-internal-tls-certs\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.907070 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-public-tls-certs\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.907120 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-credential-keys\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.907156 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-config-data\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.907234 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-fernet-keys\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.907363 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-combined-ca-bundle\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.907406 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-scripts\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.914967 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-scripts\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.915938 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-public-tls-certs\") pod \"keystone-5d97d757c9-6txl4\" (UID: 
\"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.916164 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-credential-keys\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.916475 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-combined-ca-bundle\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.918153 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-internal-tls-certs\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.919405 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-fernet-keys\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.923702 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23801ba-7898-47cc-bbf9-bd108c25f99e-config-data\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 
13:26:43.926125 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcgtr\" (UniqueName: \"kubernetes.io/projected/c23801ba-7898-47cc-bbf9-bd108c25f99e-kube-api-access-tcgtr\") pod \"keystone-5d97d757c9-6txl4\" (UID: \"c23801ba-7898-47cc-bbf9-bd108c25f99e\") " pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:43 crc kubenswrapper[4898]: I1211 13:26:43.994181 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:44 crc kubenswrapper[4898]: I1211 13:26:44.511232 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5d97d757c9-6txl4"] Dec 11 13:26:44 crc kubenswrapper[4898]: W1211 13:26:44.514936 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc23801ba_7898_47cc_bbf9_bd108c25f99e.slice/crio-8047d464daa3ad2868d49bbc1c12413d29b372a143a7df8f1b4c77619f27b1eb WatchSource:0}: Error finding container 8047d464daa3ad2868d49bbc1c12413d29b372a143a7df8f1b4c77619f27b1eb: Status 404 returned error can't find the container with id 8047d464daa3ad2868d49bbc1c12413d29b372a143a7df8f1b4c77619f27b1eb Dec 11 13:26:44 crc kubenswrapper[4898]: I1211 13:26:44.617666 4898 generic.go:334] "Generic (PLEG): container finished" podID="5924443a-434a-4efc-b04d-fdc73d3e2fe6" containerID="5eb8b1fac0eefc98cdd4aa11ee9069c852eb895b98d696667be2c09f7aa07bbb" exitCode=0 Dec 11 13:26:44 crc kubenswrapper[4898]: I1211 13:26:44.617773 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gtqwj" event={"ID":"5924443a-434a-4efc-b04d-fdc73d3e2fe6","Type":"ContainerDied","Data":"5eb8b1fac0eefc98cdd4aa11ee9069c852eb895b98d696667be2c09f7aa07bbb"} Dec 11 13:26:44 crc kubenswrapper[4898]: I1211 13:26:44.619804 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d97d757c9-6txl4" 
event={"ID":"c23801ba-7898-47cc-bbf9-bd108c25f99e","Type":"ContainerStarted","Data":"8047d464daa3ad2868d49bbc1c12413d29b372a143a7df8f1b4c77619f27b1eb"} Dec 11 13:26:44 crc kubenswrapper[4898]: I1211 13:26:44.789383 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="298d4481-1b69-43a5-89a8-218a127c0a51" path="/var/lib/kubelet/pods/298d4481-1b69-43a5-89a8-218a127c0a51/volumes" Dec 11 13:26:45 crc kubenswrapper[4898]: I1211 13:26:45.654467 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5d97d757c9-6txl4" event={"ID":"c23801ba-7898-47cc-bbf9-bd108c25f99e","Type":"ContainerStarted","Data":"220b6b82be0359dc557ffde318a371b734927e888368568c01d7c34e41f59d14"} Dec 11 13:26:45 crc kubenswrapper[4898]: I1211 13:26:45.689896 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:26:45 crc kubenswrapper[4898]: I1211 13:26:45.824956 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5d97d757c9-6txl4" podStartSLOduration=2.82493513 podStartE2EDuration="2.82493513s" podCreationTimestamp="2025-12-11 13:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:26:45.734156244 +0000 UTC m=+1363.306482701" watchObservedRunningTime="2025-12-11 13:26:45.82493513 +0000 UTC m=+1363.397261557" Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.196943 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gtqwj" Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.282720 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5924443a-434a-4efc-b04d-fdc73d3e2fe6-combined-ca-bundle\") pod \"5924443a-434a-4efc-b04d-fdc73d3e2fe6\" (UID: \"5924443a-434a-4efc-b04d-fdc73d3e2fe6\") " Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.282918 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5924443a-434a-4efc-b04d-fdc73d3e2fe6-db-sync-config-data\") pod \"5924443a-434a-4efc-b04d-fdc73d3e2fe6\" (UID: \"5924443a-434a-4efc-b04d-fdc73d3e2fe6\") " Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.283042 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4jdf\" (UniqueName: \"kubernetes.io/projected/5924443a-434a-4efc-b04d-fdc73d3e2fe6-kube-api-access-r4jdf\") pod \"5924443a-434a-4efc-b04d-fdc73d3e2fe6\" (UID: \"5924443a-434a-4efc-b04d-fdc73d3e2fe6\") " Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.295748 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5924443a-434a-4efc-b04d-fdc73d3e2fe6-kube-api-access-r4jdf" (OuterVolumeSpecName: "kube-api-access-r4jdf") pod "5924443a-434a-4efc-b04d-fdc73d3e2fe6" (UID: "5924443a-434a-4efc-b04d-fdc73d3e2fe6"). InnerVolumeSpecName "kube-api-access-r4jdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.304509 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5924443a-434a-4efc-b04d-fdc73d3e2fe6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5924443a-434a-4efc-b04d-fdc73d3e2fe6" (UID: "5924443a-434a-4efc-b04d-fdc73d3e2fe6"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.316239 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5924443a-434a-4efc-b04d-fdc73d3e2fe6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5924443a-434a-4efc-b04d-fdc73d3e2fe6" (UID: "5924443a-434a-4efc-b04d-fdc73d3e2fe6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.385407 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4jdf\" (UniqueName: \"kubernetes.io/projected/5924443a-434a-4efc-b04d-fdc73d3e2fe6-kube-api-access-r4jdf\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.385439 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5924443a-434a-4efc-b04d-fdc73d3e2fe6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.385450 4898 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5924443a-434a-4efc-b04d-fdc73d3e2fe6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.672835 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gtqwj" Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.672834 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gtqwj" event={"ID":"5924443a-434a-4efc-b04d-fdc73d3e2fe6","Type":"ContainerDied","Data":"8cbc09377cca8b85fadfd8e851a500a7781315b2e7075965b2a3130d13224ec6"} Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.673134 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cbc09377cca8b85fadfd8e851a500a7781315b2e7075965b2a3130d13224ec6" Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.677939 4898 generic.go:334] "Generic (PLEG): container finished" podID="23459d62-b558-4f82-a875-311d5fa486e5" containerID="a8fb01bc589dee8d9d5d9e46068c3f94228cd2cd6920721105182af1d784b83e" exitCode=0 Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.678033 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-mvzhc" event={"ID":"23459d62-b558-4f82-a875-311d5fa486e5","Type":"ContainerDied","Data":"a8fb01bc589dee8d9d5d9e46068c3f94228cd2cd6920721105182af1d784b83e"} Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.956204 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-qrsq6"] Dec 11 13:26:46 crc kubenswrapper[4898]: E1211 13:26:46.956810 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5924443a-434a-4efc-b04d-fdc73d3e2fe6" containerName="barbican-db-sync" Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.956859 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5924443a-434a-4efc-b04d-fdc73d3e2fe6" containerName="barbican-db-sync" Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.957229 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5924443a-434a-4efc-b04d-fdc73d3e2fe6" containerName="barbican-db-sync" Dec 11 13:26:46 crc kubenswrapper[4898]: I1211 13:26:46.964997 4898 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.019285 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-qrsq6"] Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.060411 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-668c999b4c-4pjgd"] Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.062629 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.072211 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-cns9k" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.072586 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.072742 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.090445 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-668c999b4c-4pjgd"] Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.101949 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-config\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.102143 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-ovsdbserver-nb\") pod 
\"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.102175 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgzjp\" (UniqueName: \"kubernetes.io/projected/e7b30e6b-41bb-420f-8139-5bd4646d7944-kube-api-access-vgzjp\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.102265 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.103874 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.103911 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.124288 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-58648c8f48-xlp5l"] Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 
13:26:47.127584 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.132817 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.160063 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-58648c8f48-xlp5l"] Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.170644 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-54b55bb954-hjwxn"] Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.173069 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.178008 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.194388 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54b55bb954-hjwxn"] Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.205589 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.205654 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 
crc kubenswrapper[4898]: I1211 13:26:47.205712 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ec53d2-3707-4276-89f3-58df46bb17bd-combined-ca-bundle\") pod \"barbican-worker-668c999b4c-4pjgd\" (UID: \"12ec53d2-3707-4276-89f3-58df46bb17bd\") " pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.205771 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pntfj\" (UniqueName: \"kubernetes.io/projected/12ec53d2-3707-4276-89f3-58df46bb17bd-kube-api-access-pntfj\") pod \"barbican-worker-668c999b4c-4pjgd\" (UID: \"12ec53d2-3707-4276-89f3-58df46bb17bd\") " pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.205814 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-config\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.205837 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33744ce1-11e6-4c20-b805-f9ba35221d29-config-data-custom\") pod \"barbican-keystone-listener-58648c8f48-xlp5l\" (UID: \"33744ce1-11e6-4c20-b805-f9ba35221d29\") " pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.205861 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " 
pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.205884 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgzjp\" (UniqueName: \"kubernetes.io/projected/e7b30e6b-41bb-420f-8139-5bd4646d7944-kube-api-access-vgzjp\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.205919 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33744ce1-11e6-4c20-b805-f9ba35221d29-logs\") pod \"barbican-keystone-listener-58648c8f48-xlp5l\" (UID: \"33744ce1-11e6-4c20-b805-f9ba35221d29\") " pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.205965 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12ec53d2-3707-4276-89f3-58df46bb17bd-config-data-custom\") pod \"barbican-worker-668c999b4c-4pjgd\" (UID: \"12ec53d2-3707-4276-89f3-58df46bb17bd\") " pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.206017 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33744ce1-11e6-4c20-b805-f9ba35221d29-combined-ca-bundle\") pod \"barbican-keystone-listener-58648c8f48-xlp5l\" (UID: \"33744ce1-11e6-4c20-b805-f9ba35221d29\") " pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.206039 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzrfr\" (UniqueName: 
\"kubernetes.io/projected/33744ce1-11e6-4c20-b805-f9ba35221d29-kube-api-access-bzrfr\") pod \"barbican-keystone-listener-58648c8f48-xlp5l\" (UID: \"33744ce1-11e6-4c20-b805-f9ba35221d29\") " pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.206062 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ec53d2-3707-4276-89f3-58df46bb17bd-config-data\") pod \"barbican-worker-668c999b4c-4pjgd\" (UID: \"12ec53d2-3707-4276-89f3-58df46bb17bd\") " pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.206089 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.206120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ec53d2-3707-4276-89f3-58df46bb17bd-logs\") pod \"barbican-worker-668c999b4c-4pjgd\" (UID: \"12ec53d2-3707-4276-89f3-58df46bb17bd\") " pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.206147 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33744ce1-11e6-4c20-b805-f9ba35221d29-config-data\") pod \"barbican-keystone-listener-58648c8f48-xlp5l\" (UID: \"33744ce1-11e6-4c20-b805-f9ba35221d29\") " pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.207045 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.207516 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.209016 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.209534 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-config\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.210015 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.231543 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgzjp\" (UniqueName: 
\"kubernetes.io/projected/e7b30e6b-41bb-420f-8139-5bd4646d7944-kube-api-access-vgzjp\") pod \"dnsmasq-dns-848cf88cfc-qrsq6\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.307838 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33744ce1-11e6-4c20-b805-f9ba35221d29-config-data\") pod \"barbican-keystone-listener-58648c8f48-xlp5l\" (UID: \"33744ce1-11e6-4c20-b805-f9ba35221d29\") " pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.307929 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-config-data\") pod \"barbican-api-54b55bb954-hjwxn\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.308030 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ec53d2-3707-4276-89f3-58df46bb17bd-combined-ca-bundle\") pod \"barbican-worker-668c999b4c-4pjgd\" (UID: \"12ec53d2-3707-4276-89f3-58df46bb17bd\") " pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.308071 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pntfj\" (UniqueName: \"kubernetes.io/projected/12ec53d2-3707-4276-89f3-58df46bb17bd-kube-api-access-pntfj\") pod \"barbican-worker-668c999b4c-4pjgd\" (UID: \"12ec53d2-3707-4276-89f3-58df46bb17bd\") " pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.308123 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33744ce1-11e6-4c20-b805-f9ba35221d29-config-data-custom\") pod \"barbican-keystone-listener-58648c8f48-xlp5l\" (UID: \"33744ce1-11e6-4c20-b805-f9ba35221d29\") " pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.308145 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-combined-ca-bundle\") pod \"barbican-api-54b55bb954-hjwxn\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.308490 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33744ce1-11e6-4c20-b805-f9ba35221d29-logs\") pod \"barbican-keystone-listener-58648c8f48-xlp5l\" (UID: \"33744ce1-11e6-4c20-b805-f9ba35221d29\") " pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.308858 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33744ce1-11e6-4c20-b805-f9ba35221d29-logs\") pod \"barbican-keystone-listener-58648c8f48-xlp5l\" (UID: \"33744ce1-11e6-4c20-b805-f9ba35221d29\") " pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.308933 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12ec53d2-3707-4276-89f3-58df46bb17bd-config-data-custom\") pod \"barbican-worker-668c999b4c-4pjgd\" (UID: \"12ec53d2-3707-4276-89f3-58df46bb17bd\") " pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.309021 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33744ce1-11e6-4c20-b805-f9ba35221d29-combined-ca-bundle\") pod \"barbican-keystone-listener-58648c8f48-xlp5l\" (UID: \"33744ce1-11e6-4c20-b805-f9ba35221d29\") " pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.309058 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzrfr\" (UniqueName: \"kubernetes.io/projected/33744ce1-11e6-4c20-b805-f9ba35221d29-kube-api-access-bzrfr\") pod \"barbican-keystone-listener-58648c8f48-xlp5l\" (UID: \"33744ce1-11e6-4c20-b805-f9ba35221d29\") " pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.309087 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjmp5\" (UniqueName: \"kubernetes.io/projected/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-kube-api-access-bjmp5\") pod \"barbican-api-54b55bb954-hjwxn\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.309119 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ec53d2-3707-4276-89f3-58df46bb17bd-config-data\") pod \"barbican-worker-668c999b4c-4pjgd\" (UID: \"12ec53d2-3707-4276-89f3-58df46bb17bd\") " pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.309153 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-config-data-custom\") pod \"barbican-api-54b55bb954-hjwxn\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 
13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.309221 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ec53d2-3707-4276-89f3-58df46bb17bd-logs\") pod \"barbican-worker-668c999b4c-4pjgd\" (UID: \"12ec53d2-3707-4276-89f3-58df46bb17bd\") " pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.309248 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-logs\") pod \"barbican-api-54b55bb954-hjwxn\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.310228 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ec53d2-3707-4276-89f3-58df46bb17bd-logs\") pod \"barbican-worker-668c999b4c-4pjgd\" (UID: \"12ec53d2-3707-4276-89f3-58df46bb17bd\") " pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.311194 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ec53d2-3707-4276-89f3-58df46bb17bd-combined-ca-bundle\") pod \"barbican-worker-668c999b4c-4pjgd\" (UID: \"12ec53d2-3707-4276-89f3-58df46bb17bd\") " pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.312707 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33744ce1-11e6-4c20-b805-f9ba35221d29-combined-ca-bundle\") pod \"barbican-keystone-listener-58648c8f48-xlp5l\" (UID: \"33744ce1-11e6-4c20-b805-f9ba35221d29\") " pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 
13:26:47.314165 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ec53d2-3707-4276-89f3-58df46bb17bd-config-data\") pod \"barbican-worker-668c999b4c-4pjgd\" (UID: \"12ec53d2-3707-4276-89f3-58df46bb17bd\") " pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.319885 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.324632 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33744ce1-11e6-4c20-b805-f9ba35221d29-config-data\") pod \"barbican-keystone-listener-58648c8f48-xlp5l\" (UID: \"33744ce1-11e6-4c20-b805-f9ba35221d29\") " pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.326426 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pntfj\" (UniqueName: \"kubernetes.io/projected/12ec53d2-3707-4276-89f3-58df46bb17bd-kube-api-access-pntfj\") pod \"barbican-worker-668c999b4c-4pjgd\" (UID: \"12ec53d2-3707-4276-89f3-58df46bb17bd\") " pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.329839 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12ec53d2-3707-4276-89f3-58df46bb17bd-config-data-custom\") pod \"barbican-worker-668c999b4c-4pjgd\" (UID: \"12ec53d2-3707-4276-89f3-58df46bb17bd\") " pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.331350 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33744ce1-11e6-4c20-b805-f9ba35221d29-config-data-custom\") pod 
\"barbican-keystone-listener-58648c8f48-xlp5l\" (UID: \"33744ce1-11e6-4c20-b805-f9ba35221d29\") " pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.333614 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzrfr\" (UniqueName: \"kubernetes.io/projected/33744ce1-11e6-4c20-b805-f9ba35221d29-kube-api-access-bzrfr\") pod \"barbican-keystone-listener-58648c8f48-xlp5l\" (UID: \"33744ce1-11e6-4c20-b805-f9ba35221d29\") " pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.406732 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-668c999b4c-4pjgd" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.413247 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjmp5\" (UniqueName: \"kubernetes.io/projected/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-kube-api-access-bjmp5\") pod \"barbican-api-54b55bb954-hjwxn\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.413332 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-config-data-custom\") pod \"barbican-api-54b55bb954-hjwxn\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.413440 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-logs\") pod \"barbican-api-54b55bb954-hjwxn\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 
13:26:47.413589 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-config-data\") pod \"barbican-api-54b55bb954-hjwxn\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.414088 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-logs\") pod \"barbican-api-54b55bb954-hjwxn\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.415122 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-combined-ca-bundle\") pod \"barbican-api-54b55bb954-hjwxn\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.418744 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-combined-ca-bundle\") pod \"barbican-api-54b55bb954-hjwxn\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.419126 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-config-data-custom\") pod \"barbican-api-54b55bb954-hjwxn\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.419506 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-config-data\") pod \"barbican-api-54b55bb954-hjwxn\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.433398 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjmp5\" (UniqueName: \"kubernetes.io/projected/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-kube-api-access-bjmp5\") pod \"barbican-api-54b55bb954-hjwxn\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.450846 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.508564 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:47 crc kubenswrapper[4898]: I1211 13:26:47.895231 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-qrsq6"] Dec 11 13:26:47 crc kubenswrapper[4898]: W1211 13:26:47.899573 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7b30e6b_41bb_420f_8139_5bd4646d7944.slice/crio-2c94624230355378b52cbe18da1965489caa19ecab2e7b6d6ca4322e35377122 WatchSource:0}: Error finding container 2c94624230355378b52cbe18da1965489caa19ecab2e7b6d6ca4322e35377122: Status 404 returned error can't find the container with id 2c94624230355378b52cbe18da1965489caa19ecab2e7b6d6ca4322e35377122 Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.395703 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54b55bb954-hjwxn"] Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.417407 4898 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-keystone-listener-58648c8f48-xlp5l"] Dec 11 13:26:48 crc kubenswrapper[4898]: E1211 13:26:48.423014 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7b30e6b_41bb_420f_8139_5bd4646d7944.slice/crio-conmon-a43e2ea6a9ebb2ffc87efff95e3add9a3ee1013dcd03eddccb10698aa054145f.scope\": RecentStats: unable to find data in memory cache]" Dec 11 13:26:48 crc kubenswrapper[4898]: E1211 13:26:48.436637 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7b30e6b_41bb_420f_8139_5bd4646d7944.slice/crio-conmon-a43e2ea6a9ebb2ffc87efff95e3add9a3ee1013dcd03eddccb10698aa054145f.scope\": RecentStats: unable to find data in memory cache]" Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.446743 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-668c999b4c-4pjgd"] Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.597060 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-mvzhc" Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.716856 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" event={"ID":"33744ce1-11e6-4c20-b805-f9ba35221d29","Type":"ContainerStarted","Data":"38327c4237fd857b0d64a766230e3e635cbbb5ee86bcae9eb3a7e89c6f7607ad"} Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.720775 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-mvzhc" event={"ID":"23459d62-b558-4f82-a875-311d5fa486e5","Type":"ContainerDied","Data":"5326e65bd790418544772d15d689ceb2f8acd7cf5eab25c5ae6b87a2e2a3c8ef"} Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.720818 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5326e65bd790418544772d15d689ceb2f8acd7cf5eab25c5ae6b87a2e2a3c8ef" Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.720880 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-mvzhc" Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.725892 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7b30e6b-41bb-420f-8139-5bd4646d7944" containerID="a43e2ea6a9ebb2ffc87efff95e3add9a3ee1013dcd03eddccb10698aa054145f" exitCode=0 Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.725948 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" event={"ID":"e7b30e6b-41bb-420f-8139-5bd4646d7944","Type":"ContainerDied","Data":"a43e2ea6a9ebb2ffc87efff95e3add9a3ee1013dcd03eddccb10698aa054145f"} Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.725977 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" event={"ID":"e7b30e6b-41bb-420f-8139-5bd4646d7944","Type":"ContainerStarted","Data":"2c94624230355378b52cbe18da1965489caa19ecab2e7b6d6ca4322e35377122"} Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.741403 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-668c999b4c-4pjgd" event={"ID":"12ec53d2-3707-4276-89f3-58df46bb17bd","Type":"ContainerStarted","Data":"70a0feb2469e54139c371383c00a10219bfabda7be21a6e23323cfc74daace51"} Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.753724 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23459d62-b558-4f82-a875-311d5fa486e5-combined-ca-bundle\") pod \"23459d62-b558-4f82-a875-311d5fa486e5\" (UID: \"23459d62-b558-4f82-a875-311d5fa486e5\") " Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.753874 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23459d62-b558-4f82-a875-311d5fa486e5-config-data\") pod \"23459d62-b558-4f82-a875-311d5fa486e5\" (UID: \"23459d62-b558-4f82-a875-311d5fa486e5\") " Dec 11 13:26:48 crc 
kubenswrapper[4898]: I1211 13:26:48.754009 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwt28\" (UniqueName: \"kubernetes.io/projected/23459d62-b558-4f82-a875-311d5fa486e5-kube-api-access-wwt28\") pod \"23459d62-b558-4f82-a875-311d5fa486e5\" (UID: \"23459d62-b558-4f82-a875-311d5fa486e5\") " Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.769674 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b55bb954-hjwxn" event={"ID":"8bc10d96-de00-4df3-bdd0-42d70e94b7ee","Type":"ContainerStarted","Data":"5d43a8685cfff2f82b6bd355ddf513d39d779743d7b28100ed897fca89d858f7"} Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.778683 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23459d62-b558-4f82-a875-311d5fa486e5-kube-api-access-wwt28" (OuterVolumeSpecName: "kube-api-access-wwt28") pod "23459d62-b558-4f82-a875-311d5fa486e5" (UID: "23459d62-b558-4f82-a875-311d5fa486e5"). InnerVolumeSpecName "kube-api-access-wwt28". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.856349 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwt28\" (UniqueName: \"kubernetes.io/projected/23459d62-b558-4f82-a875-311d5fa486e5-kube-api-access-wwt28\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.955124 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23459d62-b558-4f82-a875-311d5fa486e5-config-data" (OuterVolumeSpecName: "config-data") pod "23459d62-b558-4f82-a875-311d5fa486e5" (UID: "23459d62-b558-4f82-a875-311d5fa486e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.957763 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23459d62-b558-4f82-a875-311d5fa486e5-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:48 crc kubenswrapper[4898]: I1211 13:26:48.959686 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23459d62-b558-4f82-a875-311d5fa486e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23459d62-b558-4f82-a875-311d5fa486e5" (UID: "23459d62-b558-4f82-a875-311d5fa486e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:49 crc kubenswrapper[4898]: I1211 13:26:49.060134 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23459d62-b558-4f82-a875-311d5fa486e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:49 crc kubenswrapper[4898]: I1211 13:26:49.784879 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" event={"ID":"e7b30e6b-41bb-420f-8139-5bd4646d7944","Type":"ContainerStarted","Data":"1c28f25469bebac535f76c747c41cb5d41a441b15f49e78350c2ef922c6798e8"} Dec 11 13:26:49 crc kubenswrapper[4898]: I1211 13:26:49.785293 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:49 crc kubenswrapper[4898]: I1211 13:26:49.788068 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b55bb954-hjwxn" event={"ID":"8bc10d96-de00-4df3-bdd0-42d70e94b7ee","Type":"ContainerStarted","Data":"7d397eeaaab07cd02eead0a941f1b54b960ad73edcd71545268d822f2f6f648d"} Dec 11 13:26:49 crc kubenswrapper[4898]: I1211 13:26:49.788115 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-54b55bb954-hjwxn" event={"ID":"8bc10d96-de00-4df3-bdd0-42d70e94b7ee","Type":"ContainerStarted","Data":"a2377f5d11f6df07aa659dc552184e17199e4f38f38b252eb1d75582f5804779"} Dec 11 13:26:49 crc kubenswrapper[4898]: I1211 13:26:49.788214 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:49 crc kubenswrapper[4898]: I1211 13:26:49.808107 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" podStartSLOduration=3.808085045 podStartE2EDuration="3.808085045s" podCreationTimestamp="2025-12-11 13:26:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:26:49.803191913 +0000 UTC m=+1367.375518350" watchObservedRunningTime="2025-12-11 13:26:49.808085045 +0000 UTC m=+1367.380411482" Dec 11 13:26:49 crc kubenswrapper[4898]: I1211 13:26:49.828785 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-54b55bb954-hjwxn" podStartSLOduration=2.828768955 podStartE2EDuration="2.828768955s" podCreationTimestamp="2025-12-11 13:26:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:26:49.825849336 +0000 UTC m=+1367.398175793" watchObservedRunningTime="2025-12-11 13:26:49.828768955 +0000 UTC m=+1367.401095392" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.297117 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7d9697675d-q65pl"] Dec 11 13:26:50 crc kubenswrapper[4898]: E1211 13:26:50.297826 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23459d62-b558-4f82-a875-311d5fa486e5" containerName="heat-db-sync" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.297839 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="23459d62-b558-4f82-a875-311d5fa486e5" containerName="heat-db-sync" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.298080 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="23459d62-b558-4f82-a875-311d5fa486e5" containerName="heat-db-sync" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.299311 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.301732 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.303217 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.312640 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d9697675d-q65pl"] Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.391641 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08d32354-0f02-436c-a082-9d02e2ebaadc-logs\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.391694 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d32354-0f02-436c-a082-9d02e2ebaadc-internal-tls-certs\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.391753 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/08d32354-0f02-436c-a082-9d02e2ebaadc-public-tls-certs\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.391819 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5drt\" (UniqueName: \"kubernetes.io/projected/08d32354-0f02-436c-a082-9d02e2ebaadc-kube-api-access-d5drt\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.391930 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08d32354-0f02-436c-a082-9d02e2ebaadc-config-data-custom\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.391988 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08d32354-0f02-436c-a082-9d02e2ebaadc-config-data\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.392146 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d32354-0f02-436c-a082-9d02e2ebaadc-combined-ca-bundle\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.493654 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08d32354-0f02-436c-a082-9d02e2ebaadc-logs\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.493697 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d32354-0f02-436c-a082-9d02e2ebaadc-internal-tls-certs\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.493744 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d32354-0f02-436c-a082-9d02e2ebaadc-public-tls-certs\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.493789 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5drt\" (UniqueName: \"kubernetes.io/projected/08d32354-0f02-436c-a082-9d02e2ebaadc-kube-api-access-d5drt\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.493867 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08d32354-0f02-436c-a082-9d02e2ebaadc-config-data-custom\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.493908 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/08d32354-0f02-436c-a082-9d02e2ebaadc-config-data\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.493938 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d32354-0f02-436c-a082-9d02e2ebaadc-combined-ca-bundle\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.494090 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08d32354-0f02-436c-a082-9d02e2ebaadc-logs\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.499159 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08d32354-0f02-436c-a082-9d02e2ebaadc-combined-ca-bundle\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.500181 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08d32354-0f02-436c-a082-9d02e2ebaadc-config-data-custom\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.506127 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/08d32354-0f02-436c-a082-9d02e2ebaadc-public-tls-certs\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.510839 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08d32354-0f02-436c-a082-9d02e2ebaadc-config-data\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.511566 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5drt\" (UniqueName: \"kubernetes.io/projected/08d32354-0f02-436c-a082-9d02e2ebaadc-kube-api-access-d5drt\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.515191 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08d32354-0f02-436c-a082-9d02e2ebaadc-internal-tls-certs\") pod \"barbican-api-7d9697675d-q65pl\" (UID: \"08d32354-0f02-436c-a082-9d02e2ebaadc\") " pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.642193 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:26:50 crc kubenswrapper[4898]: I1211 13:26:50.812492 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:26:51 crc kubenswrapper[4898]: E1211 13:26:51.248124 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc\": RecentStats: unable to find data in memory cache]" Dec 11 13:26:51 crc kubenswrapper[4898]: I1211 13:26:51.816487 4898 generic.go:334] "Generic (PLEG): container finished" podID="76d89e82-8f2e-4198-8736-28293404a0bd" containerID="87016d8e9b9f797522f428ba0d870d8d0bee8cdfd5017a713ac92eb31e20f22f" exitCode=0 Dec 11 13:26:51 crc kubenswrapper[4898]: I1211 13:26:51.816579 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6v7s7" event={"ID":"76d89e82-8f2e-4198-8736-28293404a0bd","Type":"ContainerDied","Data":"87016d8e9b9f797522f428ba0d870d8d0bee8cdfd5017a713ac92eb31e20f22f"} Dec 11 13:26:52 crc kubenswrapper[4898]: E1211 13:26:52.692449 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc\": RecentStats: unable to find data in memory cache]" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.075306 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.168619 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-config-data\") pod \"76d89e82-8f2e-4198-8736-28293404a0bd\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.168696 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-scripts\") pod \"76d89e82-8f2e-4198-8736-28293404a0bd\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.168750 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/76d89e82-8f2e-4198-8736-28293404a0bd-etc-machine-id\") pod \"76d89e82-8f2e-4198-8736-28293404a0bd\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.168891 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzlfq\" (UniqueName: \"kubernetes.io/projected/76d89e82-8f2e-4198-8736-28293404a0bd-kube-api-access-pzlfq\") pod \"76d89e82-8f2e-4198-8736-28293404a0bd\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.168919 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-combined-ca-bundle\") pod \"76d89e82-8f2e-4198-8736-28293404a0bd\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.169047 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-db-sync-config-data\") pod \"76d89e82-8f2e-4198-8736-28293404a0bd\" (UID: \"76d89e82-8f2e-4198-8736-28293404a0bd\") " Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.169822 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76d89e82-8f2e-4198-8736-28293404a0bd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "76d89e82-8f2e-4198-8736-28293404a0bd" (UID: "76d89e82-8f2e-4198-8736-28293404a0bd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.176695 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d89e82-8f2e-4198-8736-28293404a0bd-kube-api-access-pzlfq" (OuterVolumeSpecName: "kube-api-access-pzlfq") pod "76d89e82-8f2e-4198-8736-28293404a0bd" (UID: "76d89e82-8f2e-4198-8736-28293404a0bd"). InnerVolumeSpecName "kube-api-access-pzlfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.177155 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "76d89e82-8f2e-4198-8736-28293404a0bd" (UID: "76d89e82-8f2e-4198-8736-28293404a0bd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.185674 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-scripts" (OuterVolumeSpecName: "scripts") pod "76d89e82-8f2e-4198-8736-28293404a0bd" (UID: "76d89e82-8f2e-4198-8736-28293404a0bd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.205384 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76d89e82-8f2e-4198-8736-28293404a0bd" (UID: "76d89e82-8f2e-4198-8736-28293404a0bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.245954 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-config-data" (OuterVolumeSpecName: "config-data") pod "76d89e82-8f2e-4198-8736-28293404a0bd" (UID: "76d89e82-8f2e-4198-8736-28293404a0bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.273355 4898 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.273391 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.273402 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.273412 4898 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/76d89e82-8f2e-4198-8736-28293404a0bd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:57 
crc kubenswrapper[4898]: I1211 13:26:57.273423 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzlfq\" (UniqueName: \"kubernetes.io/projected/76d89e82-8f2e-4198-8736-28293404a0bd-kube-api-access-pzlfq\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.273436 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d89e82-8f2e-4198-8736-28293404a0bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.322091 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.400304 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-vhw9q"] Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.400861 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-vhw9q" podUID="ec7227b2-711c-479a-91b3-4f4d2f77ace4" containerName="dnsmasq-dns" containerID="cri-o://b4360fab85d966d93e1f09c72b64e4f81927aff75dca30e4b837dcdb84663211" gracePeriod=10 Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.937082 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6v7s7" event={"ID":"76d89e82-8f2e-4198-8736-28293404a0bd","Type":"ContainerDied","Data":"70d347fdd48e904d66a228ed5fa1d1994b4b9c1d3260d11fc00099a4d99f221c"} Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.937598 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70d347fdd48e904d66a228ed5fa1d1994b4b9c1d3260d11fc00099a4d99f221c" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.937668 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6v7s7" Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.960553 4898 generic.go:334] "Generic (PLEG): container finished" podID="ec7227b2-711c-479a-91b3-4f4d2f77ace4" containerID="b4360fab85d966d93e1f09c72b64e4f81927aff75dca30e4b837dcdb84663211" exitCode=0 Dec 11 13:26:57 crc kubenswrapper[4898]: I1211 13:26:57.960590 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-vhw9q" event={"ID":"ec7227b2-711c-479a-91b3-4f4d2f77ace4","Type":"ContainerDied","Data":"b4360fab85d966d93e1f09c72b64e4f81927aff75dca30e4b837dcdb84663211"} Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.177035 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-vhw9q" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.295408 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d9697675d-q65pl"] Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.317724 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-dns-svc\") pod \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.317781 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-dns-swift-storage-0\") pod \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.317867 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58wpg\" (UniqueName: \"kubernetes.io/projected/ec7227b2-711c-479a-91b3-4f4d2f77ace4-kube-api-access-58wpg\") pod 
\"ec7227b2-711c-479a-91b3-4f4d2f77ace4\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.317892 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-config\") pod \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.317995 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-ovsdbserver-sb\") pod \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.318102 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-ovsdbserver-nb\") pod \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\" (UID: \"ec7227b2-711c-479a-91b3-4f4d2f77ace4\") " Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.330581 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec7227b2-711c-479a-91b3-4f4d2f77ace4-kube-api-access-58wpg" (OuterVolumeSpecName: "kube-api-access-58wpg") pod "ec7227b2-711c-479a-91b3-4f4d2f77ace4" (UID: "ec7227b2-711c-479a-91b3-4f4d2f77ace4"). InnerVolumeSpecName "kube-api-access-58wpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:26:58 crc kubenswrapper[4898]: E1211 13:26:58.380546 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.395796 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 13:26:58 crc kubenswrapper[4898]: E1211 13:26:58.396228 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d89e82-8f2e-4198-8736-28293404a0bd" containerName="cinder-db-sync" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.396243 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d89e82-8f2e-4198-8736-28293404a0bd" containerName="cinder-db-sync" Dec 11 13:26:58 crc kubenswrapper[4898]: E1211 13:26:58.396252 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec7227b2-711c-479a-91b3-4f4d2f77ace4" containerName="init" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.396259 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec7227b2-711c-479a-91b3-4f4d2f77ace4" containerName="init" Dec 11 13:26:58 crc kubenswrapper[4898]: E1211 13:26:58.396297 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec7227b2-711c-479a-91b3-4f4d2f77ace4" containerName="dnsmasq-dns" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.396304 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec7227b2-711c-479a-91b3-4f4d2f77ace4" containerName="dnsmasq-dns" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.396540 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec7227b2-711c-479a-91b3-4f4d2f77ace4" containerName="dnsmasq-dns" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.396555 4898 
memory_manager.go:354] "RemoveStaleState removing state" podUID="76d89e82-8f2e-4198-8736-28293404a0bd" containerName="cinder-db-sync" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.398847 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.402362 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.402554 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.402675 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.403370 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-d68j7" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.433420 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58wpg\" (UniqueName: \"kubernetes.io/projected/ec7227b2-711c-479a-91b3-4f4d2f77ace4-kube-api-access-58wpg\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.437142 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.536760 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.536836 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.536870 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-config-data\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.536890 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-scripts\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.536912 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9a8d652-d53c-4daf-8714-ffd773a4134c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.536945 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j8mj\" (UniqueName: \"kubernetes.io/projected/d9a8d652-d53c-4daf-8714-ffd773a4134c-kube-api-access-6j8mj\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.573700 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-cx55d"] Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.575719 4898 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.588855 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-cx55d"] Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.638784 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-config-data\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.638840 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.638872 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-scripts\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.638902 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9a8d652-d53c-4daf-8714-ffd773a4134c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.638953 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8mj\" (UniqueName: 
\"kubernetes.io/projected/d9a8d652-d53c-4daf-8714-ffd773a4134c-kube-api-access-6j8mj\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.639032 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-dns-svc\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.639059 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.639112 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.639144 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-config\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.639203 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.639257 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p9mr\" (UniqueName: \"kubernetes.io/projected/5bdd1d6a-439c-487d-be8f-59ddafe284de-kube-api-access-9p9mr\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.639290 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.641463 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9a8d652-d53c-4daf-8714-ffd773a4134c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.675321 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ec7227b2-711c-479a-91b3-4f4d2f77ace4" (UID: "ec7227b2-711c-479a-91b3-4f4d2f77ace4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.692307 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-config-data\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.714115 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j8mj\" (UniqueName: \"kubernetes.io/projected/d9a8d652-d53c-4daf-8714-ffd773a4134c-kube-api-access-6j8mj\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.715033 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-scripts\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.716096 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.741337 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.741396 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-config\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.741548 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p9mr\" (UniqueName: \"kubernetes.io/projected/5bdd1d6a-439c-487d-be8f-59ddafe284de-kube-api-access-9p9mr\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.741625 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.741785 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-dns-svc\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.741819 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.741895 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.746854 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.747842 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-config\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.748590 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.751520 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-dns-svc\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.753000 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" 
Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.771087 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec7227b2-711c-479a-91b3-4f4d2f77ace4" (UID: "ec7227b2-711c-479a-91b3-4f4d2f77ace4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.771550 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ec7227b2-711c-479a-91b3-4f4d2f77ace4" (UID: "ec7227b2-711c-479a-91b3-4f4d2f77ace4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.786267 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.788551 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p9mr\" (UniqueName: \"kubernetes.io/projected/5bdd1d6a-439c-487d-be8f-59ddafe284de-kube-api-access-9p9mr\") pod \"dnsmasq-dns-6578955fd5-cx55d\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.806132 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.852476 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.852516 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.926144 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ec7227b2-711c-479a-91b3-4f4d2f77ace4" (UID: "ec7227b2-711c-479a-91b3-4f4d2f77ace4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.926396 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-config" (OuterVolumeSpecName: "config") pod "ec7227b2-711c-479a-91b3-4f4d2f77ace4" (UID: "ec7227b2-711c-479a-91b3-4f4d2f77ace4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.950528 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.953143 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.959708 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.979975 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.981059 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-config-data-custom\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.981098 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.981184 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-logs\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.981234 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brzs6\" (UniqueName: \"kubernetes.io/projected/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-kube-api-access-brzs6\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.981329 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.981515 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-scripts\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.981564 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-config-data\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.982053 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.982080 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec7227b2-711c-479a-91b3-4f4d2f77ace4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 13:26:58 crc kubenswrapper[4898]: I1211 13:26:58.986072 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.044698 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d9697675d-q65pl" 
event={"ID":"08d32354-0f02-436c-a082-9d02e2ebaadc","Type":"ContainerStarted","Data":"fd07fcac3eb2b83f6aaa2d0036b4ce18cf80bff92b6158ef6002ae17f600be67"} Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.071348 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" event={"ID":"33744ce1-11e6-4c20-b805-f9ba35221d29","Type":"ContainerStarted","Data":"33a99f5d049cb86f423f0273424d865653a623500390d110619a4368ec2122dc"} Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.085428 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-logs\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.085492 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brzs6\" (UniqueName: \"kubernetes.io/projected/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-kube-api-access-brzs6\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.085553 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.085637 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-scripts\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.085665 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-config-data\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.085707 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-config-data-custom\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.085721 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.085809 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.087049 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-logs\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.098940 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " 
pod="openstack/cinder-api-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.099264 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce9ab866-6ba7-4782-a002-5f0f4c252b4e","Type":"ContainerStarted","Data":"4a6d22d4236dd26e95018dae89db81ca0c5083f985d5108229e042e699a8d85a"} Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.099283 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" containerName="ceilometer-notification-agent" containerID="cri-o://b04914645067dec13ff09d02784be8c6ab8970a8b466dfc229a4638004568231" gracePeriod=30 Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.099378 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" containerName="proxy-httpd" containerID="cri-o://4a6d22d4236dd26e95018dae89db81ca0c5083f985d5108229e042e699a8d85a" gracePeriod=30 Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.099438 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" containerName="sg-core" containerID="cri-o://da70ba5ae6f4baab0b2940ef49df332e4852297598182a36995f2e14873b46b6" gracePeriod=30 Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.100018 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.101227 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-config-data\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.102086 4898 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-scripts\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.103984 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-config-data-custom\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.133023 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brzs6\" (UniqueName: \"kubernetes.io/projected/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-kube-api-access-brzs6\") pod \"cinder-api-0\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " pod="openstack/cinder-api-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.162802 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-vhw9q" event={"ID":"ec7227b2-711c-479a-91b3-4f4d2f77ace4","Type":"ContainerDied","Data":"12e8ff482a378e1dc9dfdbba6c6ae7109eb79093bedcc5bf2ca056fae419ed70"} Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.162862 4898 scope.go:117] "RemoveContainer" containerID="b4360fab85d966d93e1f09c72b64e4f81927aff75dca30e4b837dcdb84663211" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.163001 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-vhw9q" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.180933 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-668c999b4c-4pjgd" event={"ID":"12ec53d2-3707-4276-89f3-58df46bb17bd","Type":"ContainerStarted","Data":"535b7e9d7bc28a95029cb4caba397856a980451279dbb91b8bacda65b527f356"} Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.354250 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.421130 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-vhw9q"] Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.439727 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-vhw9q"] Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.469838 4898 scope.go:117] "RemoveContainer" containerID="13612d4efdb7fa2015fc0f48a89b3fec89946e6e4b65080075b9d834d5dbb2e4" Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.708788 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.918239 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-cx55d"] Dec 11 13:26:59 crc kubenswrapper[4898]: I1211 13:26:59.985978 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.317041 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" event={"ID":"33744ce1-11e6-4c20-b805-f9ba35221d29","Type":"ContainerStarted","Data":"14e7195c4f67ea2735470baedeb2a2ca5f91c0a3cd1c9dcd0663b4d17cc11861"} Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.337741 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"b9c2941b-6d82-4218-8cc7-91f6473d0fbf","Type":"ContainerStarted","Data":"3d04a7dc70631fda6e3d0eaecd889ae56d25494dfc1f1a4a1dc8fbc35a12d5e0"} Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.370577 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-668c999b4c-4pjgd" event={"ID":"12ec53d2-3707-4276-89f3-58df46bb17bd","Type":"ContainerStarted","Data":"c8a85c3a8d79c510fb6964fd1959631a1610a68e9a552fb03fb96bd17e686b79"} Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.375866 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-58648c8f48-xlp5l" podStartSLOduration=5.297320867 podStartE2EDuration="14.375848608s" podCreationTimestamp="2025-12-11 13:26:46 +0000 UTC" firstStartedPulling="2025-12-11 13:26:48.405274471 +0000 UTC m=+1365.977600908" lastFinishedPulling="2025-12-11 13:26:57.483802212 +0000 UTC m=+1375.056128649" observedRunningTime="2025-12-11 13:27:00.344231012 +0000 UTC m=+1377.916557459" watchObservedRunningTime="2025-12-11 13:27:00.375848608 +0000 UTC m=+1377.948175045" Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.393768 4898 generic.go:334] "Generic (PLEG): container finished" podID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" containerID="4a6d22d4236dd26e95018dae89db81ca0c5083f985d5108229e042e699a8d85a" exitCode=0 Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.393851 4898 generic.go:334] "Generic (PLEG): container finished" podID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" containerID="da70ba5ae6f4baab0b2940ef49df332e4852297598182a36995f2e14873b46b6" exitCode=2 Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.393967 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce9ab866-6ba7-4782-a002-5f0f4c252b4e","Type":"ContainerDied","Data":"4a6d22d4236dd26e95018dae89db81ca0c5083f985d5108229e042e699a8d85a"} Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 
13:27:00.394035 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce9ab866-6ba7-4782-a002-5f0f4c252b4e","Type":"ContainerDied","Data":"da70ba5ae6f4baab0b2940ef49df332e4852297598182a36995f2e14873b46b6"} Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.395252 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-668c999b4c-4pjgd" podStartSLOduration=5.224691062 podStartE2EDuration="14.395243533s" podCreationTimestamp="2025-12-11 13:26:46 +0000 UTC" firstStartedPulling="2025-12-11 13:26:48.409959918 +0000 UTC m=+1365.982286355" lastFinishedPulling="2025-12-11 13:26:57.580512389 +0000 UTC m=+1375.152838826" observedRunningTime="2025-12-11 13:27:00.393323541 +0000 UTC m=+1377.965649978" watchObservedRunningTime="2025-12-11 13:27:00.395243533 +0000 UTC m=+1377.967569970" Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.403668 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-cx55d" event={"ID":"5bdd1d6a-439c-487d-be8f-59ddafe284de","Type":"ContainerStarted","Data":"8705cc0b17edc446597efe1056e88bda92c586ab8c6c347c48a38cb9cf5f098f"} Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.410368 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d9a8d652-d53c-4daf-8714-ffd773a4134c","Type":"ContainerStarted","Data":"529f9d55cae159d48ea1e6ab5fdef2619ee32217527a9abba28a1d089cd669cd"} Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.418035 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d9697675d-q65pl" event={"ID":"08d32354-0f02-436c-a082-9d02e2ebaadc","Type":"ContainerStarted","Data":"9a211b32ecab38e93641b8b7ff889e8330e7021f7b33139e76e2c03c61b72ccc"} Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.418078 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d9697675d-q65pl" 
event={"ID":"08d32354-0f02-436c-a082-9d02e2ebaadc","Type":"ContainerStarted","Data":"b7a4ada0b52bf31180a23778e175af2c869b7f09824b0d159655b61656d3eead"} Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.418403 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.418691 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.445867 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7d9697675d-q65pl" podStartSLOduration=10.445842232 podStartE2EDuration="10.445842232s" podCreationTimestamp="2025-12-11 13:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:00.436127799 +0000 UTC m=+1378.008454236" watchObservedRunningTime="2025-12-11 13:27:00.445842232 +0000 UTC m=+1378.018168669" Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.693697 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.801344 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec7227b2-711c-479a-91b3-4f4d2f77ace4" path="/var/lib/kubelet/pods/ec7227b2-711c-479a-91b3-4f4d2f77ace4/volumes" Dec 11 13:27:00 crc kubenswrapper[4898]: I1211 13:27:00.901624 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-d94654b9b-gfmwl" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.469058 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.539546 4898 generic.go:334] "Generic (PLEG): container finished" podID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" containerID="b04914645067dec13ff09d02784be8c6ab8970a8b466dfc229a4638004568231" exitCode=0 Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.539596 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce9ab866-6ba7-4782-a002-5f0f4c252b4e","Type":"ContainerDied","Data":"b04914645067dec13ff09d02784be8c6ab8970a8b466dfc229a4638004568231"} Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.539621 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce9ab866-6ba7-4782-a002-5f0f4c252b4e","Type":"ContainerDied","Data":"2bcac39c7b4842809b5d1cba6f56405506a81fb3096449e3605b2ccbf4a393e8"} Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.539637 4898 scope.go:117] "RemoveContainer" containerID="4a6d22d4236dd26e95018dae89db81ca0c5083f985d5108229e042e699a8d85a" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.539754 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.588243 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf29f\" (UniqueName: \"kubernetes.io/projected/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-kube-api-access-wf29f\") pod \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.588343 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-run-httpd\") pod \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.588369 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-log-httpd\") pod \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.588404 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-scripts\") pod \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.588887 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-sg-core-conf-yaml\") pod \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.588921 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ce9ab866-6ba7-4782-a002-5f0f4c252b4e" (UID: "ce9ab866-6ba7-4782-a002-5f0f4c252b4e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.588942 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ce9ab866-6ba7-4782-a002-5f0f4c252b4e" (UID: "ce9ab866-6ba7-4782-a002-5f0f4c252b4e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.588943 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-combined-ca-bundle\") pod \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.589139 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-config-data\") pod \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\" (UID: \"ce9ab866-6ba7-4782-a002-5f0f4c252b4e\") " Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.590338 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.590367 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.595522 
4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-kube-api-access-wf29f" (OuterVolumeSpecName: "kube-api-access-wf29f") pod "ce9ab866-6ba7-4782-a002-5f0f4c252b4e" (UID: "ce9ab866-6ba7-4782-a002-5f0f4c252b4e"). InnerVolumeSpecName "kube-api-access-wf29f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.602334 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-scripts" (OuterVolumeSpecName: "scripts") pod "ce9ab866-6ba7-4782-a002-5f0f4c252b4e" (UID: "ce9ab866-6ba7-4782-a002-5f0f4c252b4e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.636849 4898 generic.go:334] "Generic (PLEG): container finished" podID="5bdd1d6a-439c-487d-be8f-59ddafe284de" containerID="c2bdcec82c23f76fa53eac0089e53257284830fce4b1da3169165202f0a01c04" exitCode=0 Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.636939 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-cx55d" event={"ID":"5bdd1d6a-439c-487d-be8f-59ddafe284de","Type":"ContainerDied","Data":"c2bdcec82c23f76fa53eac0089e53257284830fce4b1da3169165202f0a01c04"} Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.657019 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9c2941b-6d82-4218-8cc7-91f6473d0fbf","Type":"ContainerStarted","Data":"4d0ffac38ead83f4b73e7c0742be67db7ebbd09cc27e6dadd523be21c77db4e9"} Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.677616 4898 scope.go:117] "RemoveContainer" containerID="da70ba5ae6f4baab0b2940ef49df332e4852297598182a36995f2e14873b46b6" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.685690 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.696436 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf29f\" (UniqueName: \"kubernetes.io/projected/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-kube-api-access-wf29f\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.696495 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.726440 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ce9ab866-6ba7-4782-a002-5f0f4c252b4e" (UID: "ce9ab866-6ba7-4782-a002-5f0f4c252b4e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.737026 4898 scope.go:117] "RemoveContainer" containerID="b04914645067dec13ff09d02784be8c6ab8970a8b466dfc229a4638004568231" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.813312 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce9ab866-6ba7-4782-a002-5f0f4c252b4e" (UID: "ce9ab866-6ba7-4782-a002-5f0f4c252b4e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.814961 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.815969 4898 scope.go:117] "RemoveContainer" containerID="4a6d22d4236dd26e95018dae89db81ca0c5083f985d5108229e042e699a8d85a" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.903793 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-config-data" (OuterVolumeSpecName: "config-data") pod "ce9ab866-6ba7-4782-a002-5f0f4c252b4e" (UID: "ce9ab866-6ba7-4782-a002-5f0f4c252b4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.918761 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.918804 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9ab866-6ba7-4782-a002-5f0f4c252b4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:01 crc kubenswrapper[4898]: E1211 13:27:01.921871 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a6d22d4236dd26e95018dae89db81ca0c5083f985d5108229e042e699a8d85a\": container with ID starting with 4a6d22d4236dd26e95018dae89db81ca0c5083f985d5108229e042e699a8d85a not found: ID does not exist" containerID="4a6d22d4236dd26e95018dae89db81ca0c5083f985d5108229e042e699a8d85a" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.921930 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6d22d4236dd26e95018dae89db81ca0c5083f985d5108229e042e699a8d85a"} err="failed to get container status \"4a6d22d4236dd26e95018dae89db81ca0c5083f985d5108229e042e699a8d85a\": rpc error: code = NotFound desc = could not find container \"4a6d22d4236dd26e95018dae89db81ca0c5083f985d5108229e042e699a8d85a\": container with ID starting with 4a6d22d4236dd26e95018dae89db81ca0c5083f985d5108229e042e699a8d85a not found: ID does not exist" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.921955 4898 scope.go:117] "RemoveContainer" containerID="da70ba5ae6f4baab0b2940ef49df332e4852297598182a36995f2e14873b46b6" Dec 11 13:27:01 crc kubenswrapper[4898]: E1211 13:27:01.938621 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da70ba5ae6f4baab0b2940ef49df332e4852297598182a36995f2e14873b46b6\": container with ID starting with da70ba5ae6f4baab0b2940ef49df332e4852297598182a36995f2e14873b46b6 not found: ID does not exist" containerID="da70ba5ae6f4baab0b2940ef49df332e4852297598182a36995f2e14873b46b6" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.938686 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da70ba5ae6f4baab0b2940ef49df332e4852297598182a36995f2e14873b46b6"} err="failed to get container status \"da70ba5ae6f4baab0b2940ef49df332e4852297598182a36995f2e14873b46b6\": rpc error: code = NotFound desc = could not find container \"da70ba5ae6f4baab0b2940ef49df332e4852297598182a36995f2e14873b46b6\": container with ID starting with da70ba5ae6f4baab0b2940ef49df332e4852297598182a36995f2e14873b46b6 not found: ID does not exist" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.938719 4898 scope.go:117] "RemoveContainer" containerID="b04914645067dec13ff09d02784be8c6ab8970a8b466dfc229a4638004568231" Dec 11 13:27:01 crc kubenswrapper[4898]: E1211 
13:27:01.942556 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04914645067dec13ff09d02784be8c6ab8970a8b466dfc229a4638004568231\": container with ID starting with b04914645067dec13ff09d02784be8c6ab8970a8b466dfc229a4638004568231 not found: ID does not exist" containerID="b04914645067dec13ff09d02784be8c6ab8970a8b466dfc229a4638004568231" Dec 11 13:27:01 crc kubenswrapper[4898]: I1211 13:27:01.942597 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04914645067dec13ff09d02784be8c6ab8970a8b466dfc229a4638004568231"} err="failed to get container status \"b04914645067dec13ff09d02784be8c6ab8970a8b466dfc229a4638004568231\": rpc error: code = NotFound desc = could not find container \"b04914645067dec13ff09d02784be8c6ab8970a8b466dfc229a4638004568231\": container with ID starting with b04914645067dec13ff09d02784be8c6ab8970a8b466dfc229a4638004568231 not found: ID does not exist" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.225640 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.249980 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.291580 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:02 crc kubenswrapper[4898]: E1211 13:27:02.292114 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" containerName="sg-core" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.292130 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" containerName="sg-core" Dec 11 13:27:02 crc kubenswrapper[4898]: E1211 13:27:02.292146 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" containerName="ceilometer-notification-agent" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.292153 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" containerName="ceilometer-notification-agent" Dec 11 13:27:02 crc kubenswrapper[4898]: E1211 13:27:02.292162 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" containerName="proxy-httpd" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.292168 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" containerName="proxy-httpd" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.292364 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" containerName="sg-core" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.292384 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" containerName="ceilometer-notification-agent" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.292409 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" containerName="proxy-httpd" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.294430 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.301949 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.302134 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.362540 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.431388 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee3c0532-b7dd-4147-812a-3add6446d9a1-run-httpd\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.431486 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-config-data\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.431621 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee3c0532-b7dd-4147-812a-3add6446d9a1-log-httpd\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.431690 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-scripts\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 
13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.431714 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.431755 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86jtx\" (UniqueName: \"kubernetes.io/projected/ee3c0532-b7dd-4147-812a-3add6446d9a1-kube-api-access-86jtx\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.431810 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.538975 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee3c0532-b7dd-4147-812a-3add6446d9a1-log-httpd\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.539120 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-scripts\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.539171 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.539230 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86jtx\" (UniqueName: \"kubernetes.io/projected/ee3c0532-b7dd-4147-812a-3add6446d9a1-kube-api-access-86jtx\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.539311 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.539403 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee3c0532-b7dd-4147-812a-3add6446d9a1-run-httpd\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.539923 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-config-data\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.542885 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee3c0532-b7dd-4147-812a-3add6446d9a1-log-httpd\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc 
kubenswrapper[4898]: I1211 13:27:02.545270 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-config-data\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.545563 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee3c0532-b7dd-4147-812a-3add6446d9a1-run-httpd\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.558507 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.559322 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.559429 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.571784 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-scripts\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.579673 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-86jtx\" (UniqueName: \"kubernetes.io/projected/ee3c0532-b7dd-4147-812a-3add6446d9a1-kube-api-access-86jtx\") pod \"ceilometer-0\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.672568 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9c2941b-6d82-4218-8cc7-91f6473d0fbf","Type":"ContainerStarted","Data":"58e185896e058998d630d9bdb5ec15963cbaba34c173a769cba3bf6bcd8a32b3"} Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.672719 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b9c2941b-6d82-4218-8cc7-91f6473d0fbf" containerName="cinder-api-log" containerID="cri-o://4d0ffac38ead83f4b73e7c0742be67db7ebbd09cc27e6dadd523be21c77db4e9" gracePeriod=30 Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.672754 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b9c2941b-6d82-4218-8cc7-91f6473d0fbf" containerName="cinder-api" containerID="cri-o://58e185896e058998d630d9bdb5ec15963cbaba34c173a769cba3bf6bcd8a32b3" gracePeriod=30 Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.672959 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.683080 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-cx55d" event={"ID":"5bdd1d6a-439c-487d-be8f-59ddafe284de","Type":"ContainerStarted","Data":"0e86e7b3d2d0a5a5a6c3c1a89921a8ade3c0656cd7567f7d30efb5d3154c4586"} Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.687816 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.690483 4898 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/cinder-api-0" podStartSLOduration=4.690469929 podStartE2EDuration="4.690469929s" podCreationTimestamp="2025-12-11 13:26:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:02.686881172 +0000 UTC m=+1380.259207609" watchObservedRunningTime="2025-12-11 13:27:02.690469929 +0000 UTC m=+1380.262796366" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.700897 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d9a8d652-d53c-4daf-8714-ffd773a4134c","Type":"ContainerStarted","Data":"4e9be3b2d83f0bf8e93bb7f3fe33609f7df69d2d59278146f559c4bad6041762"} Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.716920 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-cx55d" podStartSLOduration=4.716904543 podStartE2EDuration="4.716904543s" podCreationTimestamp="2025-12-11 13:26:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:02.715209667 +0000 UTC m=+1380.287536104" watchObservedRunningTime="2025-12-11 13:27:02.716904543 +0000 UTC m=+1380.289230980" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.798387 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:27:02 crc kubenswrapper[4898]: I1211 13:27:02.809666 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce9ab866-6ba7-4782-a002-5f0f4c252b4e" path="/var/lib/kubelet/pods/ce9ab866-6ba7-4782-a002-5f0f4c252b4e/volumes" Dec 11 13:27:02 crc kubenswrapper[4898]: E1211 13:27:02.853091 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98476a59_e1fa_4912_828f_c2bc4d6f1317.slice/crio-600f8cad965db2164982c51d6ef27f0ff5d8efb54479c6a9017208646a81bccc\": RecentStats: unable to find data in memory cache]" Dec 11 13:27:03 crc kubenswrapper[4898]: I1211 13:27:03.434948 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:03 crc kubenswrapper[4898]: W1211 13:27:03.445729 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee3c0532_b7dd_4147_812a_3add6446d9a1.slice/crio-1ec0ac589729fef512a2acaf03be80142bdc464d3d0f82f074ad05575c8e2602 WatchSource:0}: Error finding container 1ec0ac589729fef512a2acaf03be80142bdc464d3d0f82f074ad05575c8e2602: Status 404 returned error can't find the container with id 1ec0ac589729fef512a2acaf03be80142bdc464d3d0f82f074ad05575c8e2602 Dec 11 13:27:03 crc kubenswrapper[4898]: I1211 13:27:03.716144 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d9a8d652-d53c-4daf-8714-ffd773a4134c","Type":"ContainerStarted","Data":"64d0e16feee0dd13c1b44cd7c280241ae24d4eb2df3f52c22d5a9944d741cba9"} Dec 11 13:27:03 crc kubenswrapper[4898]: I1211 13:27:03.718282 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ee3c0532-b7dd-4147-812a-3add6446d9a1","Type":"ContainerStarted","Data":"1ec0ac589729fef512a2acaf03be80142bdc464d3d0f82f074ad05575c8e2602"} Dec 11 13:27:03 crc kubenswrapper[4898]: I1211 13:27:03.741855 4898 generic.go:334] "Generic (PLEG): container finished" podID="b9c2941b-6d82-4218-8cc7-91f6473d0fbf" containerID="4d0ffac38ead83f4b73e7c0742be67db7ebbd09cc27e6dadd523be21c77db4e9" exitCode=143 Dec 11 13:27:03 crc kubenswrapper[4898]: I1211 13:27:03.742654 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9c2941b-6d82-4218-8cc7-91f6473d0fbf","Type":"ContainerDied","Data":"4d0ffac38ead83f4b73e7c0742be67db7ebbd09cc27e6dadd523be21c77db4e9"} Dec 11 13:27:03 crc kubenswrapper[4898]: I1211 13:27:03.807011 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 11 13:27:03 crc kubenswrapper[4898]: I1211 13:27:03.869883 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.886152124 podStartE2EDuration="5.869862576s" podCreationTimestamp="2025-12-11 13:26:58 +0000 UTC" firstStartedPulling="2025-12-11 13:26:59.746141656 +0000 UTC m=+1377.318468093" lastFinishedPulling="2025-12-11 13:27:00.729852108 +0000 UTC m=+1378.302178545" observedRunningTime="2025-12-11 13:27:03.760886486 +0000 UTC m=+1381.333212923" watchObservedRunningTime="2025-12-11 13:27:03.869862576 +0000 UTC m=+1381.442189013" Dec 11 13:27:04 crc kubenswrapper[4898]: I1211 13:27:04.468921 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-84b4b98fdc-tbjdg" Dec 11 13:27:04 crc kubenswrapper[4898]: I1211 13:27:04.570757 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d94654b9b-gfmwl"] Dec 11 13:27:04 crc kubenswrapper[4898]: I1211 13:27:04.571014 4898 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-d94654b9b-gfmwl" podUID="df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" containerName="neutron-api" containerID="cri-o://9ef00d43822c19520cf754da05080d4a738171a7c2e4dd6946a0bfd214b8e810" gracePeriod=30 Dec 11 13:27:04 crc kubenswrapper[4898]: I1211 13:27:04.571574 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d94654b9b-gfmwl" podUID="df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" containerName="neutron-httpd" containerID="cri-o://c71ccb054f5ca50913600baeabfc1b605ea7ad4e97a622031413c8e61b1dbc90" gracePeriod=30 Dec 11 13:27:04 crc kubenswrapper[4898]: I1211 13:27:04.772873 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee3c0532-b7dd-4147-812a-3add6446d9a1","Type":"ContainerStarted","Data":"18ae6c0fa34c3c05f5a3592448be5988ac5031be2913b0ede4dba797a3166f63"} Dec 11 13:27:05 crc kubenswrapper[4898]: I1211 13:27:05.802373 4898 generic.go:334] "Generic (PLEG): container finished" podID="df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" containerID="c71ccb054f5ca50913600baeabfc1b605ea7ad4e97a622031413c8e61b1dbc90" exitCode=0 Dec 11 13:27:05 crc kubenswrapper[4898]: I1211 13:27:05.802449 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d94654b9b-gfmwl" event={"ID":"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030","Type":"ContainerDied","Data":"c71ccb054f5ca50913600baeabfc1b605ea7ad4e97a622031413c8e61b1dbc90"} Dec 11 13:27:06 crc kubenswrapper[4898]: I1211 13:27:06.842656 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee3c0532-b7dd-4147-812a-3add6446d9a1","Type":"ContainerStarted","Data":"101e35693121caba6825c50408ee1563ac024c47b03bb8af36c46448894ca9e9"} Dec 11 13:27:07 crc kubenswrapper[4898]: I1211 13:27:07.655645 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:27:07 crc kubenswrapper[4898]: I1211 13:27:07.858271 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee3c0532-b7dd-4147-812a-3add6446d9a1","Type":"ContainerStarted","Data":"0ab2ee3a37554d4b70a2e0e001aaa985748619ff652d537c5d296943c659c6dc"} Dec 11 13:27:07 crc kubenswrapper[4898]: I1211 13:27:07.929126 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d9697675d-q65pl" Dec 11 13:27:08 crc kubenswrapper[4898]: I1211 13:27:08.031854 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-54b55bb954-hjwxn"] Dec 11 13:27:08 crc kubenswrapper[4898]: I1211 13:27:08.032105 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-54b55bb954-hjwxn" podUID="8bc10d96-de00-4df3-bdd0-42d70e94b7ee" containerName="barbican-api-log" containerID="cri-o://a2377f5d11f6df07aa659dc552184e17199e4f38f38b252eb1d75582f5804779" gracePeriod=30 Dec 11 13:27:08 crc kubenswrapper[4898]: I1211 13:27:08.032253 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-54b55bb954-hjwxn" podUID="8bc10d96-de00-4df3-bdd0-42d70e94b7ee" containerName="barbican-api" containerID="cri-o://7d397eeaaab07cd02eead0a941f1b54b960ad73edcd71545268d822f2f6f648d" gracePeriod=30 Dec 11 13:27:08 crc kubenswrapper[4898]: I1211 13:27:08.871603 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee3c0532-b7dd-4147-812a-3add6446d9a1","Type":"ContainerStarted","Data":"2d5f34b7189654cc5dc196435838bdd9ec0ba65468cf2c2304dd540e3d9c265f"} Dec 11 13:27:08 crc kubenswrapper[4898]: I1211 13:27:08.873327 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 11 13:27:08 crc kubenswrapper[4898]: I1211 13:27:08.877483 4898 generic.go:334] "Generic (PLEG): container finished" podID="8bc10d96-de00-4df3-bdd0-42d70e94b7ee" 
containerID="a2377f5d11f6df07aa659dc552184e17199e4f38f38b252eb1d75582f5804779" exitCode=143 Dec 11 13:27:08 crc kubenswrapper[4898]: I1211 13:27:08.877540 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b55bb954-hjwxn" event={"ID":"8bc10d96-de00-4df3-bdd0-42d70e94b7ee","Type":"ContainerDied","Data":"a2377f5d11f6df07aa659dc552184e17199e4f38f38b252eb1d75582f5804779"} Dec 11 13:27:08 crc kubenswrapper[4898]: I1211 13:27:08.904373 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.350903224 podStartE2EDuration="6.904349433s" podCreationTimestamp="2025-12-11 13:27:02 +0000 UTC" firstStartedPulling="2025-12-11 13:27:03.455236435 +0000 UTC m=+1381.027562892" lastFinishedPulling="2025-12-11 13:27:08.008682664 +0000 UTC m=+1385.581009101" observedRunningTime="2025-12-11 13:27:08.893854689 +0000 UTC m=+1386.466181146" watchObservedRunningTime="2025-12-11 13:27:08.904349433 +0000 UTC m=+1386.476675880" Dec 11 13:27:08 crc kubenswrapper[4898]: I1211 13:27:08.982612 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.064579 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-qrsq6"] Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.066167 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" podUID="e7b30e6b-41bb-420f-8139-5bd4646d7944" containerName="dnsmasq-dns" containerID="cri-o://1c28f25469bebac535f76c747c41cb5d41a441b15f49e78350c2ef922c6798e8" gracePeriod=10 Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.185733 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.359525 4898 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.730418 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.822109 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-dns-swift-storage-0\") pod \"e7b30e6b-41bb-420f-8139-5bd4646d7944\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.822190 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-ovsdbserver-nb\") pod \"e7b30e6b-41bb-420f-8139-5bd4646d7944\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.822224 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-config\") pod \"e7b30e6b-41bb-420f-8139-5bd4646d7944\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.822303 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgzjp\" (UniqueName: \"kubernetes.io/projected/e7b30e6b-41bb-420f-8139-5bd4646d7944-kube-api-access-vgzjp\") pod \"e7b30e6b-41bb-420f-8139-5bd4646d7944\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.822333 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-ovsdbserver-sb\") pod \"e7b30e6b-41bb-420f-8139-5bd4646d7944\" (UID: 
\"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.822375 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-dns-svc\") pod \"e7b30e6b-41bb-420f-8139-5bd4646d7944\" (UID: \"e7b30e6b-41bb-420f-8139-5bd4646d7944\") " Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.831318 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b30e6b-41bb-420f-8139-5bd4646d7944-kube-api-access-vgzjp" (OuterVolumeSpecName: "kube-api-access-vgzjp") pod "e7b30e6b-41bb-420f-8139-5bd4646d7944" (UID: "e7b30e6b-41bb-420f-8139-5bd4646d7944"). InnerVolumeSpecName "kube-api-access-vgzjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.920535 4898 generic.go:334] "Generic (PLEG): container finished" podID="df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" containerID="9ef00d43822c19520cf754da05080d4a738171a7c2e4dd6946a0bfd214b8e810" exitCode=0 Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.920615 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d94654b9b-gfmwl" event={"ID":"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030","Type":"ContainerDied","Data":"9ef00d43822c19520cf754da05080d4a738171a7c2e4dd6946a0bfd214b8e810"} Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.922240 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7b30e6b-41bb-420f-8139-5bd4646d7944" containerID="1c28f25469bebac535f76c747c41cb5d41a441b15f49e78350c2ef922c6798e8" exitCode=0 Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.922505 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d9a8d652-d53c-4daf-8714-ffd773a4134c" containerName="cinder-scheduler" 
containerID="cri-o://4e9be3b2d83f0bf8e93bb7f3fe33609f7df69d2d59278146f559c4bad6041762" gracePeriod=30 Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.922941 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.923445 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" event={"ID":"e7b30e6b-41bb-420f-8139-5bd4646d7944","Type":"ContainerDied","Data":"1c28f25469bebac535f76c747c41cb5d41a441b15f49e78350c2ef922c6798e8"} Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.923477 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-qrsq6" event={"ID":"e7b30e6b-41bb-420f-8139-5bd4646d7944","Type":"ContainerDied","Data":"2c94624230355378b52cbe18da1965489caa19ecab2e7b6d6ca4322e35377122"} Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.923491 4898 scope.go:117] "RemoveContainer" containerID="1c28f25469bebac535f76c747c41cb5d41a441b15f49e78350c2ef922c6798e8" Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.924641 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d9a8d652-d53c-4daf-8714-ffd773a4134c" containerName="probe" containerID="cri-o://64d0e16feee0dd13c1b44cd7c280241ae24d4eb2df3f52c22d5a9944d741cba9" gracePeriod=30 Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.933963 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgzjp\" (UniqueName: \"kubernetes.io/projected/e7b30e6b-41bb-420f-8139-5bd4646d7944-kube-api-access-vgzjp\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.957361 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-dns-svc" (OuterVolumeSpecName: "dns-svc") pod 
"e7b30e6b-41bb-420f-8139-5bd4646d7944" (UID: "e7b30e6b-41bb-420f-8139-5bd4646d7944"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:09 crc kubenswrapper[4898]: I1211 13:27:09.958583 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e7b30e6b-41bb-420f-8139-5bd4646d7944" (UID: "e7b30e6b-41bb-420f-8139-5bd4646d7944"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.000968 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e7b30e6b-41bb-420f-8139-5bd4646d7944" (UID: "e7b30e6b-41bb-420f-8139-5bd4646d7944"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.014365 4898 scope.go:117] "RemoveContainer" containerID="a43e2ea6a9ebb2ffc87efff95e3add9a3ee1013dcd03eddccb10698aa054145f" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.025432 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-config" (OuterVolumeSpecName: "config") pod "e7b30e6b-41bb-420f-8139-5bd4646d7944" (UID: "e7b30e6b-41bb-420f-8139-5bd4646d7944"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.037385 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.043639 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.043675 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.043688 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.046192 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e7b30e6b-41bb-420f-8139-5bd4646d7944" (UID: "e7b30e6b-41bb-420f-8139-5bd4646d7944"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.085274 4898 scope.go:117] "RemoveContainer" containerID="1c28f25469bebac535f76c747c41cb5d41a441b15f49e78350c2ef922c6798e8" Dec 11 13:27:10 crc kubenswrapper[4898]: E1211 13:27:10.100770 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c28f25469bebac535f76c747c41cb5d41a441b15f49e78350c2ef922c6798e8\": container with ID starting with 1c28f25469bebac535f76c747c41cb5d41a441b15f49e78350c2ef922c6798e8 not found: ID does not exist" containerID="1c28f25469bebac535f76c747c41cb5d41a441b15f49e78350c2ef922c6798e8" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.100833 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c28f25469bebac535f76c747c41cb5d41a441b15f49e78350c2ef922c6798e8"} err="failed to get container status \"1c28f25469bebac535f76c747c41cb5d41a441b15f49e78350c2ef922c6798e8\": rpc error: code = NotFound desc = could not find container \"1c28f25469bebac535f76c747c41cb5d41a441b15f49e78350c2ef922c6798e8\": container with ID starting with 1c28f25469bebac535f76c747c41cb5d41a441b15f49e78350c2ef922c6798e8 not found: ID does not exist" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.100877 4898 scope.go:117] "RemoveContainer" containerID="a43e2ea6a9ebb2ffc87efff95e3add9a3ee1013dcd03eddccb10698aa054145f" Dec 11 13:27:10 crc kubenswrapper[4898]: E1211 13:27:10.104766 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a43e2ea6a9ebb2ffc87efff95e3add9a3ee1013dcd03eddccb10698aa054145f\": container with ID starting with a43e2ea6a9ebb2ffc87efff95e3add9a3ee1013dcd03eddccb10698aa054145f not found: ID does not exist" containerID="a43e2ea6a9ebb2ffc87efff95e3add9a3ee1013dcd03eddccb10698aa054145f" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.104824 
4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a43e2ea6a9ebb2ffc87efff95e3add9a3ee1013dcd03eddccb10698aa054145f"} err="failed to get container status \"a43e2ea6a9ebb2ffc87efff95e3add9a3ee1013dcd03eddccb10698aa054145f\": rpc error: code = NotFound desc = could not find container \"a43e2ea6a9ebb2ffc87efff95e3add9a3ee1013dcd03eddccb10698aa054145f\": container with ID starting with a43e2ea6a9ebb2ffc87efff95e3add9a3ee1013dcd03eddccb10698aa054145f not found: ID does not exist" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.137711 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d94654b9b-gfmwl" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.145902 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7b30e6b-41bb-420f-8139-5bd4646d7944-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.247410 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-combined-ca-bundle\") pod \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.247479 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-httpd-config\") pod \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.247567 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-ovndb-tls-certs\") pod 
\"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.247755 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7bjd\" (UniqueName: \"kubernetes.io/projected/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-kube-api-access-g7bjd\") pod \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.247807 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-config\") pod \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\" (UID: \"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030\") " Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.254168 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" (UID: "df8a71d7-debf-4fc0-84c0-a3b1f4cb7030"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.285572 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-kube-api-access-g7bjd" (OuterVolumeSpecName: "kube-api-access-g7bjd") pod "df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" (UID: "df8a71d7-debf-4fc0-84c0-a3b1f4cb7030"). InnerVolumeSpecName "kube-api-access-g7bjd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.349971 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.350006 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7bjd\" (UniqueName: \"kubernetes.io/projected/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-kube-api-access-g7bjd\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.360134 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" (UID: "df8a71d7-debf-4fc0-84c0-a3b1f4cb7030"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.384648 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-config" (OuterVolumeSpecName: "config") pod "df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" (UID: "df8a71d7-debf-4fc0-84c0-a3b1f4cb7030"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.434703 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" (UID: "df8a71d7-debf-4fc0-84c0-a3b1f4cb7030"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.456541 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.456588 4898 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.456601 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.567704 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-qrsq6"] Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.583010 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-qrsq6"] Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.789724 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b30e6b-41bb-420f-8139-5bd4646d7944" path="/var/lib/kubelet/pods/e7b30e6b-41bb-420f-8139-5bd4646d7944/volumes" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.937150 4898 generic.go:334] "Generic (PLEG): container finished" podID="d9a8d652-d53c-4daf-8714-ffd773a4134c" containerID="4e9be3b2d83f0bf8e93bb7f3fe33609f7df69d2d59278146f559c4bad6041762" exitCode=0 Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.937218 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d9a8d652-d53c-4daf-8714-ffd773a4134c","Type":"ContainerDied","Data":"4e9be3b2d83f0bf8e93bb7f3fe33609f7df69d2d59278146f559c4bad6041762"} Dec 11 13:27:10 crc 
kubenswrapper[4898]: I1211 13:27:10.940345 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d94654b9b-gfmwl" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.940386 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d94654b9b-gfmwl" event={"ID":"df8a71d7-debf-4fc0-84c0-a3b1f4cb7030","Type":"ContainerDied","Data":"97372bde30f05b0912633875803c2a93a455f48b80e4958d1f47121364becccf"} Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.940422 4898 scope.go:117] "RemoveContainer" containerID="c71ccb054f5ca50913600baeabfc1b605ea7ad4e97a622031413c8e61b1dbc90" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.971853 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d94654b9b-gfmwl"] Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.983392 4898 scope.go:117] "RemoveContainer" containerID="9ef00d43822c19520cf754da05080d4a738171a7c2e4dd6946a0bfd214b8e810" Dec 11 13:27:10 crc kubenswrapper[4898]: I1211 13:27:10.984650 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d94654b9b-gfmwl"] Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.665925 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.790331 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-combined-ca-bundle\") pod \"d9a8d652-d53c-4daf-8714-ffd773a4134c\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.790733 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j8mj\" (UniqueName: \"kubernetes.io/projected/d9a8d652-d53c-4daf-8714-ffd773a4134c-kube-api-access-6j8mj\") pod \"d9a8d652-d53c-4daf-8714-ffd773a4134c\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.790885 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-config-data\") pod \"d9a8d652-d53c-4daf-8714-ffd773a4134c\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.790925 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-config-data-custom\") pod \"d9a8d652-d53c-4daf-8714-ffd773a4134c\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.790940 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9a8d652-d53c-4daf-8714-ffd773a4134c-etc-machine-id\") pod \"d9a8d652-d53c-4daf-8714-ffd773a4134c\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.790997 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-scripts\") pod \"d9a8d652-d53c-4daf-8714-ffd773a4134c\" (UID: \"d9a8d652-d53c-4daf-8714-ffd773a4134c\") " Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.791880 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9a8d652-d53c-4daf-8714-ffd773a4134c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d9a8d652-d53c-4daf-8714-ffd773a4134c" (UID: "d9a8d652-d53c-4daf-8714-ffd773a4134c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.812638 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-scripts" (OuterVolumeSpecName: "scripts") pod "d9a8d652-d53c-4daf-8714-ffd773a4134c" (UID: "d9a8d652-d53c-4daf-8714-ffd773a4134c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.816903 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9a8d652-d53c-4daf-8714-ffd773a4134c-kube-api-access-6j8mj" (OuterVolumeSpecName: "kube-api-access-6j8mj") pod "d9a8d652-d53c-4daf-8714-ffd773a4134c" (UID: "d9a8d652-d53c-4daf-8714-ffd773a4134c"). InnerVolumeSpecName "kube-api-access-6j8mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.832990 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d9a8d652-d53c-4daf-8714-ffd773a4134c" (UID: "d9a8d652-d53c-4daf-8714-ffd773a4134c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.893962 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.893993 4898 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9a8d652-d53c-4daf-8714-ffd773a4134c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.894001 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.894010 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j8mj\" (UniqueName: \"kubernetes.io/projected/d9a8d652-d53c-4daf-8714-ffd773a4134c-kube-api-access-6j8mj\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.963439 4898 generic.go:334] "Generic (PLEG): container finished" podID="8bc10d96-de00-4df3-bdd0-42d70e94b7ee" containerID="7d397eeaaab07cd02eead0a941f1b54b960ad73edcd71545268d822f2f6f648d" exitCode=0 Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.963534 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b55bb954-hjwxn" event={"ID":"8bc10d96-de00-4df3-bdd0-42d70e94b7ee","Type":"ContainerDied","Data":"7d397eeaaab07cd02eead0a941f1b54b960ad73edcd71545268d822f2f6f648d"} Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.968859 4898 generic.go:334] "Generic (PLEG): container finished" podID="d9a8d652-d53c-4daf-8714-ffd773a4134c" containerID="64d0e16feee0dd13c1b44cd7c280241ae24d4eb2df3f52c22d5a9944d741cba9" exitCode=0 Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 
13:27:11.968928 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d9a8d652-d53c-4daf-8714-ffd773a4134c","Type":"ContainerDied","Data":"64d0e16feee0dd13c1b44cd7c280241ae24d4eb2df3f52c22d5a9944d741cba9"} Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.968960 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d9a8d652-d53c-4daf-8714-ffd773a4134c","Type":"ContainerDied","Data":"529f9d55cae159d48ea1e6ab5fdef2619ee32217527a9abba28a1d089cd669cd"} Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.968983 4898 scope.go:117] "RemoveContainer" containerID="64d0e16feee0dd13c1b44cd7c280241ae24d4eb2df3f52c22d5a9944d741cba9" Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.969364 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.982561 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9a8d652-d53c-4daf-8714-ffd773a4134c" (UID: "d9a8d652-d53c-4daf-8714-ffd773a4134c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:11 crc kubenswrapper[4898]: I1211 13:27:11.997131 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.018816 4898 scope.go:117] "RemoveContainer" containerID="4e9be3b2d83f0bf8e93bb7f3fe33609f7df69d2d59278146f559c4bad6041762" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.025654 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-config-data" (OuterVolumeSpecName: "config-data") pod "d9a8d652-d53c-4daf-8714-ffd773a4134c" (UID: "d9a8d652-d53c-4daf-8714-ffd773a4134c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.032723 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.060471 4898 scope.go:117] "RemoveContainer" containerID="64d0e16feee0dd13c1b44cd7c280241ae24d4eb2df3f52c22d5a9944d741cba9" Dec 11 13:27:12 crc kubenswrapper[4898]: E1211 13:27:12.061069 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64d0e16feee0dd13c1b44cd7c280241ae24d4eb2df3f52c22d5a9944d741cba9\": container with ID starting with 64d0e16feee0dd13c1b44cd7c280241ae24d4eb2df3f52c22d5a9944d741cba9 not found: ID does not exist" containerID="64d0e16feee0dd13c1b44cd7c280241ae24d4eb2df3f52c22d5a9944d741cba9" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.061181 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d0e16feee0dd13c1b44cd7c280241ae24d4eb2df3f52c22d5a9944d741cba9"} err="failed to get container status \"64d0e16feee0dd13c1b44cd7c280241ae24d4eb2df3f52c22d5a9944d741cba9\": rpc error: code = NotFound desc = could not find container \"64d0e16feee0dd13c1b44cd7c280241ae24d4eb2df3f52c22d5a9944d741cba9\": container with ID starting with 64d0e16feee0dd13c1b44cd7c280241ae24d4eb2df3f52c22d5a9944d741cba9 not found: ID does not exist" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.061299 4898 scope.go:117] "RemoveContainer" containerID="4e9be3b2d83f0bf8e93bb7f3fe33609f7df69d2d59278146f559c4bad6041762" Dec 11 13:27:12 crc kubenswrapper[4898]: E1211 13:27:12.062220 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e9be3b2d83f0bf8e93bb7f3fe33609f7df69d2d59278146f559c4bad6041762\": container with ID starting with 4e9be3b2d83f0bf8e93bb7f3fe33609f7df69d2d59278146f559c4bad6041762 not found: ID does not exist" containerID="4e9be3b2d83f0bf8e93bb7f3fe33609f7df69d2d59278146f559c4bad6041762" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 
13:27:12.062367 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e9be3b2d83f0bf8e93bb7f3fe33609f7df69d2d59278146f559c4bad6041762"} err="failed to get container status \"4e9be3b2d83f0bf8e93bb7f3fe33609f7df69d2d59278146f559c4bad6041762\": rpc error: code = NotFound desc = could not find container \"4e9be3b2d83f0bf8e93bb7f3fe33609f7df69d2d59278146f559c4bad6041762\": container with ID starting with 4e9be3b2d83f0bf8e93bb7f3fe33609f7df69d2d59278146f559c4bad6041762 not found: ID does not exist" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.099276 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a8d652-d53c-4daf-8714-ffd773a4134c-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.200942 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-logs\") pod \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.201049 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-config-data-custom\") pod \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.201089 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-config-data\") pod \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.201129 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-combined-ca-bundle\") pod \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.201247 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjmp5\" (UniqueName: \"kubernetes.io/projected/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-kube-api-access-bjmp5\") pod \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\" (UID: \"8bc10d96-de00-4df3-bdd0-42d70e94b7ee\") " Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.201366 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-logs" (OuterVolumeSpecName: "logs") pod "8bc10d96-de00-4df3-bdd0-42d70e94b7ee" (UID: "8bc10d96-de00-4df3-bdd0-42d70e94b7ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.201929 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-logs\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.209888 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8bc10d96-de00-4df3-bdd0-42d70e94b7ee" (UID: "8bc10d96-de00-4df3-bdd0-42d70e94b7ee"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.210188 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-kube-api-access-bjmp5" (OuterVolumeSpecName: "kube-api-access-bjmp5") pod "8bc10d96-de00-4df3-bdd0-42d70e94b7ee" (UID: "8bc10d96-de00-4df3-bdd0-42d70e94b7ee"). InnerVolumeSpecName "kube-api-access-bjmp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.237571 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bc10d96-de00-4df3-bdd0-42d70e94b7ee" (UID: "8bc10d96-de00-4df3-bdd0-42d70e94b7ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.283978 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-config-data" (OuterVolumeSpecName: "config-data") pod "8bc10d96-de00-4df3-bdd0-42d70e94b7ee" (UID: "8bc10d96-de00-4df3-bdd0-42d70e94b7ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.304528 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.304742 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.304926 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.304997 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjmp5\" (UniqueName: \"kubernetes.io/projected/8bc10d96-de00-4df3-bdd0-42d70e94b7ee-kube-api-access-bjmp5\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.375011 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.388708 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.404488 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 13:27:12 crc kubenswrapper[4898]: E1211 13:27:12.405131 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b30e6b-41bb-420f-8139-5bd4646d7944" containerName="dnsmasq-dns" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.405224 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b30e6b-41bb-420f-8139-5bd4646d7944" containerName="dnsmasq-dns" Dec 11 
13:27:12 crc kubenswrapper[4898]: E1211 13:27:12.405326 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b30e6b-41bb-420f-8139-5bd4646d7944" containerName="init" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.405381 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b30e6b-41bb-420f-8139-5bd4646d7944" containerName="init" Dec 11 13:27:12 crc kubenswrapper[4898]: E1211 13:27:12.405436 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a8d652-d53c-4daf-8714-ffd773a4134c" containerName="cinder-scheduler" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.405534 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a8d652-d53c-4daf-8714-ffd773a4134c" containerName="cinder-scheduler" Dec 11 13:27:12 crc kubenswrapper[4898]: E1211 13:27:12.405598 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" containerName="neutron-httpd" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.405662 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" containerName="neutron-httpd" Dec 11 13:27:12 crc kubenswrapper[4898]: E1211 13:27:12.405733 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" containerName="neutron-api" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.405799 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" containerName="neutron-api" Dec 11 13:27:12 crc kubenswrapper[4898]: E1211 13:27:12.405855 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a8d652-d53c-4daf-8714-ffd773a4134c" containerName="probe" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.405913 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a8d652-d53c-4daf-8714-ffd773a4134c" containerName="probe" Dec 11 13:27:12 crc kubenswrapper[4898]: E1211 13:27:12.406011 4898 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc10d96-de00-4df3-bdd0-42d70e94b7ee" containerName="barbican-api" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.406072 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc10d96-de00-4df3-bdd0-42d70e94b7ee" containerName="barbican-api" Dec 11 13:27:12 crc kubenswrapper[4898]: E1211 13:27:12.406131 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc10d96-de00-4df3-bdd0-42d70e94b7ee" containerName="barbican-api-log" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.406192 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc10d96-de00-4df3-bdd0-42d70e94b7ee" containerName="barbican-api-log" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.409432 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" containerName="neutron-api" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.409836 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b30e6b-41bb-420f-8139-5bd4646d7944" containerName="dnsmasq-dns" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.409947 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bc10d96-de00-4df3-bdd0-42d70e94b7ee" containerName="barbican-api" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.410742 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a8d652-d53c-4daf-8714-ffd773a4134c" containerName="probe" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.410826 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bc10d96-de00-4df3-bdd0-42d70e94b7ee" containerName="barbican-api-log" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.410899 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a8d652-d53c-4daf-8714-ffd773a4134c" containerName="cinder-scheduler" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.410983 4898 
memory_manager.go:354] "RemoveStaleState removing state" podUID="df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" containerName="neutron-httpd" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.412567 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.416414 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.425307 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.455068 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.509431 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/512ddc04-04b7-409c-a856-4f9bf3b22c50-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.509558 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512ddc04-04b7-409c-a856-4f9bf3b22c50-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.509658 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/512ddc04-04b7-409c-a856-4f9bf3b22c50-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc 
kubenswrapper[4898]: I1211 13:27:12.509688 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fd6d\" (UniqueName: \"kubernetes.io/projected/512ddc04-04b7-409c-a856-4f9bf3b22c50-kube-api-access-8fd6d\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.509765 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512ddc04-04b7-409c-a856-4f9bf3b22c50-config-data\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.509792 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512ddc04-04b7-409c-a856-4f9bf3b22c50-scripts\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.575861 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.578210 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6849f86cdd-lt69n" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.611515 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/512ddc04-04b7-409c-a856-4f9bf3b22c50-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.611637 4898 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512ddc04-04b7-409c-a856-4f9bf3b22c50-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.611795 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/512ddc04-04b7-409c-a856-4f9bf3b22c50-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.611840 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fd6d\" (UniqueName: \"kubernetes.io/projected/512ddc04-04b7-409c-a856-4f9bf3b22c50-kube-api-access-8fd6d\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.611892 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512ddc04-04b7-409c-a856-4f9bf3b22c50-config-data\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.611936 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512ddc04-04b7-409c-a856-4f9bf3b22c50-scripts\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.612524 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/512ddc04-04b7-409c-a856-4f9bf3b22c50-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.616022 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/512ddc04-04b7-409c-a856-4f9bf3b22c50-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.621687 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512ddc04-04b7-409c-a856-4f9bf3b22c50-scripts\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.626240 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512ddc04-04b7-409c-a856-4f9bf3b22c50-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.642613 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fd6d\" (UniqueName: \"kubernetes.io/projected/512ddc04-04b7-409c-a856-4f9bf3b22c50-kube-api-access-8fd6d\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.682889 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512ddc04-04b7-409c-a856-4f9bf3b22c50-config-data\") pod \"cinder-scheduler-0\" (UID: \"512ddc04-04b7-409c-a856-4f9bf3b22c50\") " pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.750835 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.800046 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9a8d652-d53c-4daf-8714-ffd773a4134c" path="/var/lib/kubelet/pods/d9a8d652-d53c-4daf-8714-ffd773a4134c/volumes" Dec 11 13:27:12 crc kubenswrapper[4898]: I1211 13:27:12.800749 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df8a71d7-debf-4fc0-84c0-a3b1f4cb7030" path="/var/lib/kubelet/pods/df8a71d7-debf-4fc0-84c0-a3b1f4cb7030/volumes" Dec 11 13:27:13 crc kubenswrapper[4898]: I1211 13:27:13.003042 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b55bb954-hjwxn" event={"ID":"8bc10d96-de00-4df3-bdd0-42d70e94b7ee","Type":"ContainerDied","Data":"5d43a8685cfff2f82b6bd355ddf513d39d779743d7b28100ed897fca89d858f7"} Dec 11 13:27:13 crc kubenswrapper[4898]: I1211 13:27:13.003413 4898 scope.go:117] "RemoveContainer" containerID="7d397eeaaab07cd02eead0a941f1b54b960ad73edcd71545268d822f2f6f648d" Dec 11 13:27:13 crc kubenswrapper[4898]: I1211 13:27:13.003241 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-54b55bb954-hjwxn" Dec 11 13:27:13 crc kubenswrapper[4898]: I1211 13:27:13.030745 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-54b55bb954-hjwxn"] Dec 11 13:27:13 crc kubenswrapper[4898]: I1211 13:27:13.038861 4898 scope.go:117] "RemoveContainer" containerID="a2377f5d11f6df07aa659dc552184e17199e4f38f38b252eb1d75582f5804779" Dec 11 13:27:13 crc kubenswrapper[4898]: I1211 13:27:13.043682 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-54b55bb954-hjwxn"] Dec 11 13:27:13 crc kubenswrapper[4898]: W1211 13:27:13.260861 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod512ddc04_04b7_409c_a856_4f9bf3b22c50.slice/crio-eb4ba77fb29e90ca0bdb985738a6c40aab1136775e113475753b89c56f32a061 WatchSource:0}: Error finding container eb4ba77fb29e90ca0bdb985738a6c40aab1136775e113475753b89c56f32a061: Status 404 returned error can't find the container with id eb4ba77fb29e90ca0bdb985738a6c40aab1136775e113475753b89c56f32a061 Dec 11 13:27:13 crc kubenswrapper[4898]: I1211 13:27:13.261767 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 13:27:14 crc kubenswrapper[4898]: I1211 13:27:14.016898 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"512ddc04-04b7-409c-a856-4f9bf3b22c50","Type":"ContainerStarted","Data":"56cf577ae4d3d1c32613200fed078bc4a9ef27f100edd8ae075c7e367b21e77e"} Dec 11 13:27:14 crc kubenswrapper[4898]: I1211 13:27:14.017500 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"512ddc04-04b7-409c-a856-4f9bf3b22c50","Type":"ContainerStarted","Data":"eb4ba77fb29e90ca0bdb985738a6c40aab1136775e113475753b89c56f32a061"} Dec 11 13:27:14 crc kubenswrapper[4898]: I1211 13:27:14.786791 4898 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="8bc10d96-de00-4df3-bdd0-42d70e94b7ee" path="/var/lib/kubelet/pods/8bc10d96-de00-4df3-bdd0-42d70e94b7ee/volumes" Dec 11 13:27:15 crc kubenswrapper[4898]: I1211 13:27:15.033987 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"512ddc04-04b7-409c-a856-4f9bf3b22c50","Type":"ContainerStarted","Data":"b7483da3a2b3dd00c70fb981f433ae9acdf2ab92797d39aa92afe1a135f344ef"} Dec 11 13:27:15 crc kubenswrapper[4898]: I1211 13:27:15.076424 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.076398916 podStartE2EDuration="3.076398916s" podCreationTimestamp="2025-12-11 13:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:15.065250204 +0000 UTC m=+1392.637576651" watchObservedRunningTime="2025-12-11 13:27:15.076398916 +0000 UTC m=+1392.648725353" Dec 11 13:27:15 crc kubenswrapper[4898]: I1211 13:27:15.844098 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5d97d757c9-6txl4" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.599896 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.602240 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.605225 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.605257 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.605276 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-zrhfn" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.622350 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.753723 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.755979 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11154469-5a32-47bb-bbaf-66ea95afcf82-combined-ca-bundle\") pod \"openstackclient\" (UID: \"11154469-5a32-47bb-bbaf-66ea95afcf82\") " pod="openstack/openstackclient" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.756024 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv9w6\" (UniqueName: \"kubernetes.io/projected/11154469-5a32-47bb-bbaf-66ea95afcf82-kube-api-access-lv9w6\") pod \"openstackclient\" (UID: \"11154469-5a32-47bb-bbaf-66ea95afcf82\") " pod="openstack/openstackclient" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.756053 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/11154469-5a32-47bb-bbaf-66ea95afcf82-openstack-config\") pod \"openstackclient\" (UID: 
\"11154469-5a32-47bb-bbaf-66ea95afcf82\") " pod="openstack/openstackclient" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.756168 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11154469-5a32-47bb-bbaf-66ea95afcf82-openstack-config-secret\") pod \"openstackclient\" (UID: \"11154469-5a32-47bb-bbaf-66ea95afcf82\") " pod="openstack/openstackclient" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.857907 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv9w6\" (UniqueName: \"kubernetes.io/projected/11154469-5a32-47bb-bbaf-66ea95afcf82-kube-api-access-lv9w6\") pod \"openstackclient\" (UID: \"11154469-5a32-47bb-bbaf-66ea95afcf82\") " pod="openstack/openstackclient" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.858008 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/11154469-5a32-47bb-bbaf-66ea95afcf82-openstack-config\") pod \"openstackclient\" (UID: \"11154469-5a32-47bb-bbaf-66ea95afcf82\") " pod="openstack/openstackclient" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.858284 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11154469-5a32-47bb-bbaf-66ea95afcf82-openstack-config-secret\") pod \"openstackclient\" (UID: \"11154469-5a32-47bb-bbaf-66ea95afcf82\") " pod="openstack/openstackclient" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.858455 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11154469-5a32-47bb-bbaf-66ea95afcf82-combined-ca-bundle\") pod \"openstackclient\" (UID: \"11154469-5a32-47bb-bbaf-66ea95afcf82\") " pod="openstack/openstackclient" Dec 11 13:27:17 crc 
kubenswrapper[4898]: I1211 13:27:17.858998 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/11154469-5a32-47bb-bbaf-66ea95afcf82-openstack-config\") pod \"openstackclient\" (UID: \"11154469-5a32-47bb-bbaf-66ea95afcf82\") " pod="openstack/openstackclient" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.870352 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11154469-5a32-47bb-bbaf-66ea95afcf82-openstack-config-secret\") pod \"openstackclient\" (UID: \"11154469-5a32-47bb-bbaf-66ea95afcf82\") " pod="openstack/openstackclient" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.870443 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11154469-5a32-47bb-bbaf-66ea95afcf82-combined-ca-bundle\") pod \"openstackclient\" (UID: \"11154469-5a32-47bb-bbaf-66ea95afcf82\") " pod="openstack/openstackclient" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.881402 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv9w6\" (UniqueName: \"kubernetes.io/projected/11154469-5a32-47bb-bbaf-66ea95afcf82-kube-api-access-lv9w6\") pod \"openstackclient\" (UID: \"11154469-5a32-47bb-bbaf-66ea95afcf82\") " pod="openstack/openstackclient" Dec 11 13:27:17 crc kubenswrapper[4898]: I1211 13:27:17.938605 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 11 13:27:18 crc kubenswrapper[4898]: I1211 13:27:18.447346 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 11 13:27:19 crc kubenswrapper[4898]: I1211 13:27:19.099537 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"11154469-5a32-47bb-bbaf-66ea95afcf82","Type":"ContainerStarted","Data":"100d6a1308c23a0a0e1d94e4545497b6eb55b1bacb309a089278c11cf74e5eb1"} Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.798834 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6867fd7bcf-bbj7b"] Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.801957 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.805406 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.805605 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.805852 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.817855 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6867fd7bcf-bbj7b"] Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.850864 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed17564-edd0-4a66-8b9b-04aabd280113-internal-tls-certs\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.851038 
4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frwl9\" (UniqueName: \"kubernetes.io/projected/4ed17564-edd0-4a66-8b9b-04aabd280113-kube-api-access-frwl9\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.851139 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed17564-edd0-4a66-8b9b-04aabd280113-public-tls-certs\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.851196 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ed17564-edd0-4a66-8b9b-04aabd280113-log-httpd\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.851242 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ed17564-edd0-4a66-8b9b-04aabd280113-config-data\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.851279 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ed17564-edd0-4a66-8b9b-04aabd280113-run-httpd\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc 
kubenswrapper[4898]: I1211 13:27:21.851315 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed17564-edd0-4a66-8b9b-04aabd280113-combined-ca-bundle\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.851343 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4ed17564-edd0-4a66-8b9b-04aabd280113-etc-swift\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.954939 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed17564-edd0-4a66-8b9b-04aabd280113-public-tls-certs\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.955017 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ed17564-edd0-4a66-8b9b-04aabd280113-log-httpd\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.955065 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ed17564-edd0-4a66-8b9b-04aabd280113-config-data\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 
13:27:21.955105 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ed17564-edd0-4a66-8b9b-04aabd280113-run-httpd\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.955140 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed17564-edd0-4a66-8b9b-04aabd280113-combined-ca-bundle\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.955231 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4ed17564-edd0-4a66-8b9b-04aabd280113-etc-swift\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.955266 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed17564-edd0-4a66-8b9b-04aabd280113-internal-tls-certs\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.955487 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frwl9\" (UniqueName: \"kubernetes.io/projected/4ed17564-edd0-4a66-8b9b-04aabd280113-kube-api-access-frwl9\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.961060 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ed17564-edd0-4a66-8b9b-04aabd280113-run-httpd\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.965845 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ed17564-edd0-4a66-8b9b-04aabd280113-log-httpd\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.981429 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ed17564-edd0-4a66-8b9b-04aabd280113-config-data\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.982337 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed17564-edd0-4a66-8b9b-04aabd280113-public-tls-certs\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.983438 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed17564-edd0-4a66-8b9b-04aabd280113-combined-ca-bundle\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:21 crc kubenswrapper[4898]: I1211 13:27:21.992163 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4ed17564-edd0-4a66-8b9b-04aabd280113-internal-tls-certs\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:22 crc kubenswrapper[4898]: I1211 13:27:22.021627 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4ed17564-edd0-4a66-8b9b-04aabd280113-etc-swift\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:22 crc kubenswrapper[4898]: I1211 13:27:22.028595 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frwl9\" (UniqueName: \"kubernetes.io/projected/4ed17564-edd0-4a66-8b9b-04aabd280113-kube-api-access-frwl9\") pod \"swift-proxy-6867fd7bcf-bbj7b\" (UID: \"4ed17564-edd0-4a66-8b9b-04aabd280113\") " pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:22 crc kubenswrapper[4898]: I1211 13:27:22.134294 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:22 crc kubenswrapper[4898]: I1211 13:27:22.541220 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6867fd7bcf-bbj7b"] Dec 11 13:27:23 crc kubenswrapper[4898]: I1211 13:27:23.027729 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 11 13:27:23 crc kubenswrapper[4898]: I1211 13:27:23.150603 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6867fd7bcf-bbj7b" event={"ID":"4ed17564-edd0-4a66-8b9b-04aabd280113","Type":"ContainerStarted","Data":"4d50dc7a3f11c12e099c1f520ef678e95c3147469eb7e6e0865920fd10ba8d16"} Dec 11 13:27:23 crc kubenswrapper[4898]: I1211 13:27:23.150745 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6867fd7bcf-bbj7b" event={"ID":"4ed17564-edd0-4a66-8b9b-04aabd280113","Type":"ContainerStarted","Data":"1bef1139323550f9af565f38d91b39c66edf06574f594bae9e6b276bf6454d2a"} Dec 11 13:27:23 crc kubenswrapper[4898]: I1211 13:27:23.150761 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6867fd7bcf-bbj7b" event={"ID":"4ed17564-edd0-4a66-8b9b-04aabd280113","Type":"ContainerStarted","Data":"1de3524d76cbf899c9f2b6bf7e5711070b819b821c0d08e4d164e54c8b8e5112"} Dec 11 13:27:23 crc kubenswrapper[4898]: I1211 13:27:23.150893 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:23 crc kubenswrapper[4898]: I1211 13:27:23.150926 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:23 crc kubenswrapper[4898]: I1211 13:27:23.173714 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6867fd7bcf-bbj7b" podStartSLOduration=2.173696141 podStartE2EDuration="2.173696141s" podCreationTimestamp="2025-12-11 13:27:21 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:23.171974095 +0000 UTC m=+1400.744300532" watchObservedRunningTime="2025-12-11 13:27:23.173696141 +0000 UTC m=+1400.746022578" Dec 11 13:27:23 crc kubenswrapper[4898]: I1211 13:27:23.714069 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:23 crc kubenswrapper[4898]: I1211 13:27:23.714593 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="ceilometer-central-agent" containerID="cri-o://18ae6c0fa34c3c05f5a3592448be5988ac5031be2913b0ede4dba797a3166f63" gracePeriod=30 Dec 11 13:27:23 crc kubenswrapper[4898]: I1211 13:27:23.715259 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="sg-core" containerID="cri-o://0ab2ee3a37554d4b70a2e0e001aaa985748619ff652d537c5d296943c659c6dc" gracePeriod=30 Dec 11 13:27:23 crc kubenswrapper[4898]: I1211 13:27:23.715286 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="proxy-httpd" containerID="cri-o://2d5f34b7189654cc5dc196435838bdd9ec0ba65468cf2c2304dd540e3d9c265f" gracePeriod=30 Dec 11 13:27:23 crc kubenswrapper[4898]: I1211 13:27:23.715317 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="ceilometer-notification-agent" containerID="cri-o://101e35693121caba6825c50408ee1563ac024c47b03bb8af36c46448894ca9e9" gracePeriod=30 Dec 11 13:27:23 crc kubenswrapper[4898]: I1211 13:27:23.724475 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" 
podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.202:3000/\": EOF" Dec 11 13:27:24 crc kubenswrapper[4898]: I1211 13:27:24.177063 4898 generic.go:334] "Generic (PLEG): container finished" podID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerID="2d5f34b7189654cc5dc196435838bdd9ec0ba65468cf2c2304dd540e3d9c265f" exitCode=0 Dec 11 13:27:24 crc kubenswrapper[4898]: I1211 13:27:24.177104 4898 generic.go:334] "Generic (PLEG): container finished" podID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerID="0ab2ee3a37554d4b70a2e0e001aaa985748619ff652d537c5d296943c659c6dc" exitCode=2 Dec 11 13:27:24 crc kubenswrapper[4898]: I1211 13:27:24.177117 4898 generic.go:334] "Generic (PLEG): container finished" podID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerID="18ae6c0fa34c3c05f5a3592448be5988ac5031be2913b0ede4dba797a3166f63" exitCode=0 Dec 11 13:27:24 crc kubenswrapper[4898]: I1211 13:27:24.177136 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee3c0532-b7dd-4147-812a-3add6446d9a1","Type":"ContainerDied","Data":"2d5f34b7189654cc5dc196435838bdd9ec0ba65468cf2c2304dd540e3d9c265f"} Dec 11 13:27:24 crc kubenswrapper[4898]: I1211 13:27:24.177175 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee3c0532-b7dd-4147-812a-3add6446d9a1","Type":"ContainerDied","Data":"0ab2ee3a37554d4b70a2e0e001aaa985748619ff652d537c5d296943c659c6dc"} Dec 11 13:27:24 crc kubenswrapper[4898]: I1211 13:27:24.177186 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee3c0532-b7dd-4147-812a-3add6446d9a1","Type":"ContainerDied","Data":"18ae6c0fa34c3c05f5a3592448be5988ac5031be2913b0ede4dba797a3166f63"} Dec 11 13:27:25 crc kubenswrapper[4898]: I1211 13:27:25.189873 4898 generic.go:334] "Generic (PLEG): container finished" podID="ee3c0532-b7dd-4147-812a-3add6446d9a1" 
containerID="101e35693121caba6825c50408ee1563ac024c47b03bb8af36c46448894ca9e9" exitCode=0 Dec 11 13:27:25 crc kubenswrapper[4898]: I1211 13:27:25.190161 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee3c0532-b7dd-4147-812a-3add6446d9a1","Type":"ContainerDied","Data":"101e35693121caba6825c50408ee1563ac024c47b03bb8af36c46448894ca9e9"} Dec 11 13:27:27 crc kubenswrapper[4898]: I1211 13:27:27.146591 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:27 crc kubenswrapper[4898]: I1211 13:27:27.154296 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6867fd7bcf-bbj7b" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.573962 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-qnzh8"] Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.576132 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qnzh8" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.588275 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qnzh8"] Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.638116 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpdhh\" (UniqueName: \"kubernetes.io/projected/f07f8aaa-f41a-4204-881e-274dc9a9ad74-kube-api-access-hpdhh\") pod \"nova-api-db-create-qnzh8\" (UID: \"f07f8aaa-f41a-4204-881e-274dc9a9ad74\") " pod="openstack/nova-api-db-create-qnzh8" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.638719 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f07f8aaa-f41a-4204-881e-274dc9a9ad74-operator-scripts\") pod \"nova-api-db-create-qnzh8\" (UID: \"f07f8aaa-f41a-4204-881e-274dc9a9ad74\") " pod="openstack/nova-api-db-create-qnzh8" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.684310 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-rnnwz"] Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.685985 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rnnwz" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.711216 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1a34-account-create-update-g82b9"] Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.712684 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1a34-account-create-update-g82b9" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.719506 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.728512 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rnnwz"] Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.741273 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7ckh\" (UniqueName: \"kubernetes.io/projected/446e3383-1a1b-4271-94ca-1662e36059d3-kube-api-access-f7ckh\") pod \"nova-cell0-db-create-rnnwz\" (UID: \"446e3383-1a1b-4271-94ca-1662e36059d3\") " pod="openstack/nova-cell0-db-create-rnnwz" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.741438 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f07f8aaa-f41a-4204-881e-274dc9a9ad74-operator-scripts\") pod \"nova-api-db-create-qnzh8\" (UID: \"f07f8aaa-f41a-4204-881e-274dc9a9ad74\") " pod="openstack/nova-api-db-create-qnzh8" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.741699 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpdhh\" (UniqueName: \"kubernetes.io/projected/f07f8aaa-f41a-4204-881e-274dc9a9ad74-kube-api-access-hpdhh\") pod \"nova-api-db-create-qnzh8\" (UID: \"f07f8aaa-f41a-4204-881e-274dc9a9ad74\") " pod="openstack/nova-api-db-create-qnzh8" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.741837 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/446e3383-1a1b-4271-94ca-1662e36059d3-operator-scripts\") pod \"nova-cell0-db-create-rnnwz\" (UID: \"446e3383-1a1b-4271-94ca-1662e36059d3\") " 
pod="openstack/nova-cell0-db-create-rnnwz" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.742859 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f07f8aaa-f41a-4204-881e-274dc9a9ad74-operator-scripts\") pod \"nova-api-db-create-qnzh8\" (UID: \"f07f8aaa-f41a-4204-881e-274dc9a9ad74\") " pod="openstack/nova-api-db-create-qnzh8" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.746123 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1a34-account-create-update-g82b9"] Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.778392 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpdhh\" (UniqueName: \"kubernetes.io/projected/f07f8aaa-f41a-4204-881e-274dc9a9ad74-kube-api-access-hpdhh\") pod \"nova-api-db-create-qnzh8\" (UID: \"f07f8aaa-f41a-4204-881e-274dc9a9ad74\") " pod="openstack/nova-api-db-create-qnzh8" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.811079 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-tfn2j"] Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.813296 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tfn2j" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.821606 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tfn2j"] Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.844164 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7ckh\" (UniqueName: \"kubernetes.io/projected/446e3383-1a1b-4271-94ca-1662e36059d3-kube-api-access-f7ckh\") pod \"nova-cell0-db-create-rnnwz\" (UID: \"446e3383-1a1b-4271-94ca-1662e36059d3\") " pod="openstack/nova-cell0-db-create-rnnwz" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.844228 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4db1d0-fd46-4731-8392-e8390610a2c8-operator-scripts\") pod \"nova-cell1-db-create-tfn2j\" (UID: \"2b4db1d0-fd46-4731-8392-e8390610a2c8\") " pod="openstack/nova-cell1-db-create-tfn2j" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.845842 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm798\" (UniqueName: \"kubernetes.io/projected/c2cfe6b8-8f34-455c-afe3-61b33605d648-kube-api-access-wm798\") pod \"nova-api-1a34-account-create-update-g82b9\" (UID: \"c2cfe6b8-8f34-455c-afe3-61b33605d648\") " pod="openstack/nova-api-1a34-account-create-update-g82b9" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.846099 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/446e3383-1a1b-4271-94ca-1662e36059d3-operator-scripts\") pod \"nova-cell0-db-create-rnnwz\" (UID: \"446e3383-1a1b-4271-94ca-1662e36059d3\") " pod="openstack/nova-cell0-db-create-rnnwz" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.846177 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2cfe6b8-8f34-455c-afe3-61b33605d648-operator-scripts\") pod \"nova-api-1a34-account-create-update-g82b9\" (UID: \"c2cfe6b8-8f34-455c-afe3-61b33605d648\") " pod="openstack/nova-api-1a34-account-create-update-g82b9" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.846226 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g6lj\" (UniqueName: \"kubernetes.io/projected/2b4db1d0-fd46-4731-8392-e8390610a2c8-kube-api-access-8g6lj\") pod \"nova-cell1-db-create-tfn2j\" (UID: \"2b4db1d0-fd46-4731-8392-e8390610a2c8\") " pod="openstack/nova-cell1-db-create-tfn2j" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.847754 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/446e3383-1a1b-4271-94ca-1662e36059d3-operator-scripts\") pod \"nova-cell0-db-create-rnnwz\" (UID: \"446e3383-1a1b-4271-94ca-1662e36059d3\") " pod="openstack/nova-cell0-db-create-rnnwz" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.886116 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7ckh\" (UniqueName: \"kubernetes.io/projected/446e3383-1a1b-4271-94ca-1662e36059d3-kube-api-access-f7ckh\") pod \"nova-cell0-db-create-rnnwz\" (UID: \"446e3383-1a1b-4271-94ca-1662e36059d3\") " pod="openstack/nova-cell0-db-create-rnnwz" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.903304 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qnzh8" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.907231 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d8bf-account-create-update-f5twm"] Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.908815 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d8bf-account-create-update-f5twm" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.911356 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.949695 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d8bf-account-create-update-f5twm"] Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.954838 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4db1d0-fd46-4731-8392-e8390610a2c8-operator-scripts\") pod \"nova-cell1-db-create-tfn2j\" (UID: \"2b4db1d0-fd46-4731-8392-e8390610a2c8\") " pod="openstack/nova-cell1-db-create-tfn2j" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.954950 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm798\" (UniqueName: \"kubernetes.io/projected/c2cfe6b8-8f34-455c-afe3-61b33605d648-kube-api-access-wm798\") pod \"nova-api-1a34-account-create-update-g82b9\" (UID: \"c2cfe6b8-8f34-455c-afe3-61b33605d648\") " pod="openstack/nova-api-1a34-account-create-update-g82b9" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.955085 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d173a21-f567-430e-b7fe-c47892f42873-operator-scripts\") pod \"nova-cell0-d8bf-account-create-update-f5twm\" (UID: \"5d173a21-f567-430e-b7fe-c47892f42873\") " 
pod="openstack/nova-cell0-d8bf-account-create-update-f5twm" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.955120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kp4x\" (UniqueName: \"kubernetes.io/projected/5d173a21-f567-430e-b7fe-c47892f42873-kube-api-access-9kp4x\") pod \"nova-cell0-d8bf-account-create-update-f5twm\" (UID: \"5d173a21-f567-430e-b7fe-c47892f42873\") " pod="openstack/nova-cell0-d8bf-account-create-update-f5twm" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.955176 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2cfe6b8-8f34-455c-afe3-61b33605d648-operator-scripts\") pod \"nova-api-1a34-account-create-update-g82b9\" (UID: \"c2cfe6b8-8f34-455c-afe3-61b33605d648\") " pod="openstack/nova-api-1a34-account-create-update-g82b9" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.955226 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g6lj\" (UniqueName: \"kubernetes.io/projected/2b4db1d0-fd46-4731-8392-e8390610a2c8-kube-api-access-8g6lj\") pod \"nova-cell1-db-create-tfn2j\" (UID: \"2b4db1d0-fd46-4731-8392-e8390610a2c8\") " pod="openstack/nova-cell1-db-create-tfn2j" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.956069 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4db1d0-fd46-4731-8392-e8390610a2c8-operator-scripts\") pod \"nova-cell1-db-create-tfn2j\" (UID: \"2b4db1d0-fd46-4731-8392-e8390610a2c8\") " pod="openstack/nova-cell1-db-create-tfn2j" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.956221 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2cfe6b8-8f34-455c-afe3-61b33605d648-operator-scripts\") pod 
\"nova-api-1a34-account-create-update-g82b9\" (UID: \"c2cfe6b8-8f34-455c-afe3-61b33605d648\") " pod="openstack/nova-api-1a34-account-create-update-g82b9" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.972350 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm798\" (UniqueName: \"kubernetes.io/projected/c2cfe6b8-8f34-455c-afe3-61b33605d648-kube-api-access-wm798\") pod \"nova-api-1a34-account-create-update-g82b9\" (UID: \"c2cfe6b8-8f34-455c-afe3-61b33605d648\") " pod="openstack/nova-api-1a34-account-create-update-g82b9" Dec 11 13:27:28 crc kubenswrapper[4898]: I1211 13:27:28.976736 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g6lj\" (UniqueName: \"kubernetes.io/projected/2b4db1d0-fd46-4731-8392-e8390610a2c8-kube-api-access-8g6lj\") pod \"nova-cell1-db-create-tfn2j\" (UID: \"2b4db1d0-fd46-4731-8392-e8390610a2c8\") " pod="openstack/nova-cell1-db-create-tfn2j" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.009279 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rnnwz" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.055310 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1a34-account-create-update-g82b9" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.058392 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d173a21-f567-430e-b7fe-c47892f42873-operator-scripts\") pod \"nova-cell0-d8bf-account-create-update-f5twm\" (UID: \"5d173a21-f567-430e-b7fe-c47892f42873\") " pod="openstack/nova-cell0-d8bf-account-create-update-f5twm" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.058450 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kp4x\" (UniqueName: \"kubernetes.io/projected/5d173a21-f567-430e-b7fe-c47892f42873-kube-api-access-9kp4x\") pod \"nova-cell0-d8bf-account-create-update-f5twm\" (UID: \"5d173a21-f567-430e-b7fe-c47892f42873\") " pod="openstack/nova-cell0-d8bf-account-create-update-f5twm" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.059652 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d173a21-f567-430e-b7fe-c47892f42873-operator-scripts\") pod \"nova-cell0-d8bf-account-create-update-f5twm\" (UID: \"5d173a21-f567-430e-b7fe-c47892f42873\") " pod="openstack/nova-cell0-d8bf-account-create-update-f5twm" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.083072 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kp4x\" (UniqueName: \"kubernetes.io/projected/5d173a21-f567-430e-b7fe-c47892f42873-kube-api-access-9kp4x\") pod \"nova-cell0-d8bf-account-create-update-f5twm\" (UID: \"5d173a21-f567-430e-b7fe-c47892f42873\") " pod="openstack/nova-cell0-d8bf-account-create-update-f5twm" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.084218 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c008-account-create-update-5t4pl"] Dec 11 13:27:29 crc kubenswrapper[4898]: 
I1211 13:27:29.086367 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c008-account-create-update-5t4pl" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.092353 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.109002 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c008-account-create-update-5t4pl"] Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.160695 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1188efd5-cf9c-48dc-bd72-5259294cca4c-operator-scripts\") pod \"nova-cell1-c008-account-create-update-5t4pl\" (UID: \"1188efd5-cf9c-48dc-bd72-5259294cca4c\") " pod="openstack/nova-cell1-c008-account-create-update-5t4pl" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.160787 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56gdv\" (UniqueName: \"kubernetes.io/projected/1188efd5-cf9c-48dc-bd72-5259294cca4c-kube-api-access-56gdv\") pod \"nova-cell1-c008-account-create-update-5t4pl\" (UID: \"1188efd5-cf9c-48dc-bd72-5259294cca4c\") " pod="openstack/nova-cell1-c008-account-create-update-5t4pl" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.168184 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tfn2j" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.251353 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d8bf-account-create-update-f5twm" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.262956 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56gdv\" (UniqueName: \"kubernetes.io/projected/1188efd5-cf9c-48dc-bd72-5259294cca4c-kube-api-access-56gdv\") pod \"nova-cell1-c008-account-create-update-5t4pl\" (UID: \"1188efd5-cf9c-48dc-bd72-5259294cca4c\") " pod="openstack/nova-cell1-c008-account-create-update-5t4pl" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.263154 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1188efd5-cf9c-48dc-bd72-5259294cca4c-operator-scripts\") pod \"nova-cell1-c008-account-create-update-5t4pl\" (UID: \"1188efd5-cf9c-48dc-bd72-5259294cca4c\") " pod="openstack/nova-cell1-c008-account-create-update-5t4pl" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.263830 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1188efd5-cf9c-48dc-bd72-5259294cca4c-operator-scripts\") pod \"nova-cell1-c008-account-create-update-5t4pl\" (UID: \"1188efd5-cf9c-48dc-bd72-5259294cca4c\") " pod="openstack/nova-cell1-c008-account-create-update-5t4pl" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.286289 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56gdv\" (UniqueName: \"kubernetes.io/projected/1188efd5-cf9c-48dc-bd72-5259294cca4c-kube-api-access-56gdv\") pod \"nova-cell1-c008-account-create-update-5t4pl\" (UID: \"1188efd5-cf9c-48dc-bd72-5259294cca4c\") " pod="openstack/nova-cell1-c008-account-create-update-5t4pl" Dec 11 13:27:29 crc kubenswrapper[4898]: I1211 13:27:29.480021 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c008-account-create-update-5t4pl" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.268661 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.279749 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee3c0532-b7dd-4147-812a-3add6446d9a1","Type":"ContainerDied","Data":"1ec0ac589729fef512a2acaf03be80142bdc464d3d0f82f074ad05575c8e2602"} Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.279803 4898 scope.go:117] "RemoveContainer" containerID="2d5f34b7189654cc5dc196435838bdd9ec0ba65468cf2c2304dd540e3d9c265f" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.279982 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.312797 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-scripts\") pod \"ee3c0532-b7dd-4147-812a-3add6446d9a1\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.312993 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-sg-core-conf-yaml\") pod \"ee3c0532-b7dd-4147-812a-3add6446d9a1\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.313154 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee3c0532-b7dd-4147-812a-3add6446d9a1-log-httpd\") pod \"ee3c0532-b7dd-4147-812a-3add6446d9a1\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 
13:27:31.313277 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86jtx\" (UniqueName: \"kubernetes.io/projected/ee3c0532-b7dd-4147-812a-3add6446d9a1-kube-api-access-86jtx\") pod \"ee3c0532-b7dd-4147-812a-3add6446d9a1\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.313349 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-combined-ca-bundle\") pod \"ee3c0532-b7dd-4147-812a-3add6446d9a1\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.313379 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-config-data\") pod \"ee3c0532-b7dd-4147-812a-3add6446d9a1\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.313497 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee3c0532-b7dd-4147-812a-3add6446d9a1-run-httpd\") pod \"ee3c0532-b7dd-4147-812a-3add6446d9a1\" (UID: \"ee3c0532-b7dd-4147-812a-3add6446d9a1\") " Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.314811 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee3c0532-b7dd-4147-812a-3add6446d9a1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ee3c0532-b7dd-4147-812a-3add6446d9a1" (UID: "ee3c0532-b7dd-4147-812a-3add6446d9a1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.315442 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee3c0532-b7dd-4147-812a-3add6446d9a1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ee3c0532-b7dd-4147-812a-3add6446d9a1" (UID: "ee3c0532-b7dd-4147-812a-3add6446d9a1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.340721 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee3c0532-b7dd-4147-812a-3add6446d9a1-kube-api-access-86jtx" (OuterVolumeSpecName: "kube-api-access-86jtx") pod "ee3c0532-b7dd-4147-812a-3add6446d9a1" (UID: "ee3c0532-b7dd-4147-812a-3add6446d9a1"). InnerVolumeSpecName "kube-api-access-86jtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.340843 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-scripts" (OuterVolumeSpecName: "scripts") pod "ee3c0532-b7dd-4147-812a-3add6446d9a1" (UID: "ee3c0532-b7dd-4147-812a-3add6446d9a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.358297 4898 scope.go:117] "RemoveContainer" containerID="0ab2ee3a37554d4b70a2e0e001aaa985748619ff652d537c5d296943c659c6dc" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.360960 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ee3c0532-b7dd-4147-812a-3add6446d9a1" (UID: "ee3c0532-b7dd-4147-812a-3add6446d9a1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.417498 4898 scope.go:117] "RemoveContainer" containerID="101e35693121caba6825c50408ee1563ac024c47b03bb8af36c46448894ca9e9" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.419149 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee3c0532-b7dd-4147-812a-3add6446d9a1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.419189 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.419202 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.419216 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee3c0532-b7dd-4147-812a-3add6446d9a1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.419229 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86jtx\" (UniqueName: \"kubernetes.io/projected/ee3c0532-b7dd-4147-812a-3add6446d9a1-kube-api-access-86jtx\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.446365 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee3c0532-b7dd-4147-812a-3add6446d9a1" (UID: "ee3c0532-b7dd-4147-812a-3add6446d9a1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.490283 4898 scope.go:117] "RemoveContainer" containerID="18ae6c0fa34c3c05f5a3592448be5988ac5031be2913b0ede4dba797a3166f63" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.511564 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-config-data" (OuterVolumeSpecName: "config-data") pod "ee3c0532-b7dd-4147-812a-3add6446d9a1" (UID: "ee3c0532-b7dd-4147-812a-3add6446d9a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.522869 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.523109 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee3c0532-b7dd-4147-812a-3add6446d9a1-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.589614 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qnzh8"] Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.605043 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d8bf-account-create-update-f5twm"] Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.622142 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:31 crc kubenswrapper[4898]: W1211 13:27:31.625998 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf07f8aaa_f41a_4204_881e_274dc9a9ad74.slice/crio-0912ea9b60aaf356fd9b9e196400f72884bb15e30662d8bb1ed9c8b2a0c43eb9 
WatchSource:0}: Error finding container 0912ea9b60aaf356fd9b9e196400f72884bb15e30662d8bb1ed9c8b2a0c43eb9: Status 404 returned error can't find the container with id 0912ea9b60aaf356fd9b9e196400f72884bb15e30662d8bb1ed9c8b2a0c43eb9 Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.637410 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.648921 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:31 crc kubenswrapper[4898]: E1211 13:27:31.649366 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="sg-core" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.649383 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="sg-core" Dec 11 13:27:31 crc kubenswrapper[4898]: E1211 13:27:31.649391 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="proxy-httpd" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.649397 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="proxy-httpd" Dec 11 13:27:31 crc kubenswrapper[4898]: E1211 13:27:31.649441 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="ceilometer-central-agent" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.649448 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="ceilometer-central-agent" Dec 11 13:27:31 crc kubenswrapper[4898]: E1211 13:27:31.649505 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="ceilometer-notification-agent" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.649511 4898 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="ceilometer-notification-agent" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.649775 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="proxy-httpd" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.649798 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="sg-core" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.649820 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="ceilometer-notification-agent" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.649836 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" containerName="ceilometer-central-agent" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.651736 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.655252 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.656050 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.680284 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.718941 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rnnwz"] Dec 11 13:27:31 crc kubenswrapper[4898]: W1211 13:27:31.720947 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod446e3383_1a1b_4271_94ca_1662e36059d3.slice/crio-4b84ce924720483b4a391454c702ec7df2e5424b4cb5403360134b2ded4dcf3f WatchSource:0}: Error finding container 4b84ce924720483b4a391454c702ec7df2e5424b4cb5403360134b2ded4dcf3f: Status 404 returned error can't find the container with id 4b84ce924720483b4a391454c702ec7df2e5424b4cb5403360134b2ded4dcf3f Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.728407 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.728561 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25ac34fb-287b-4ea4-b349-124a57354453-run-httpd\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc 
kubenswrapper[4898]: I1211 13:27:31.728704 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25ac34fb-287b-4ea4-b349-124a57354453-log-httpd\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.728865 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-scripts\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.728906 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-config-data\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.728982 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.729022 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkr8x\" (UniqueName: \"kubernetes.io/projected/25ac34fb-287b-4ea4-b349-124a57354453-kube-api-access-kkr8x\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.830795 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25ac34fb-287b-4ea4-b349-124a57354453-run-httpd\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.830968 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25ac34fb-287b-4ea4-b349-124a57354453-log-httpd\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.831042 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-scripts\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.831060 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-config-data\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.831106 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.831124 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkr8x\" (UniqueName: \"kubernetes.io/projected/25ac34fb-287b-4ea4-b349-124a57354453-kube-api-access-kkr8x\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc 
kubenswrapper[4898]: I1211 13:27:31.831183 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.832277 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25ac34fb-287b-4ea4-b349-124a57354453-run-httpd\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.832814 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25ac34fb-287b-4ea4-b349-124a57354453-log-httpd\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.836305 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.836824 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-scripts\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.837347 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-config-data\") pod \"ceilometer-0\" (UID: 
\"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.855796 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkr8x\" (UniqueName: \"kubernetes.io/projected/25ac34fb-287b-4ea4-b349-124a57354453-kube-api-access-kkr8x\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.856844 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " pod="openstack/ceilometer-0" Dec 11 13:27:31 crc kubenswrapper[4898]: I1211 13:27:31.992490 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.119031 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c008-account-create-update-5t4pl"] Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.146273 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tfn2j"] Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.164096 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1a34-account-create-update-g82b9"] Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.329745 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rnnwz" event={"ID":"446e3383-1a1b-4271-94ca-1662e36059d3","Type":"ContainerStarted","Data":"dc81a7b7c44e4ed19357ecb521a7ccadf35a545b8d09718313ff58bd79adce59"} Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.330081 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rnnwz" 
event={"ID":"446e3383-1a1b-4271-94ca-1662e36059d3","Type":"ContainerStarted","Data":"4b84ce924720483b4a391454c702ec7df2e5424b4cb5403360134b2ded4dcf3f"} Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.346930 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c008-account-create-update-5t4pl" event={"ID":"1188efd5-cf9c-48dc-bd72-5259294cca4c","Type":"ContainerStarted","Data":"17086baff6937e1f0965e1d496ca3bbaaa1672cba22bffa7301d0a87f6ffc017"} Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.356564 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"11154469-5a32-47bb-bbaf-66ea95afcf82","Type":"ContainerStarted","Data":"7987ff86e67f7aa39d2743cdfefd5fac0e12f5499df4e04fce7e3bc73f02ef1c"} Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.362419 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1a34-account-create-update-g82b9" event={"ID":"c2cfe6b8-8f34-455c-afe3-61b33605d648","Type":"ContainerStarted","Data":"98ce40f66b62878aa8237e75a4d371d148e987874778b63456e174e6e291908a"} Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.369094 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tfn2j" event={"ID":"2b4db1d0-fd46-4731-8392-e8390610a2c8","Type":"ContainerStarted","Data":"e3a5f372e315923c258908ab647d85c0446d0626fc70e1c640cef616e34a4fca"} Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.373591 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qnzh8" event={"ID":"f07f8aaa-f41a-4204-881e-274dc9a9ad74","Type":"ContainerStarted","Data":"0912ea9b60aaf356fd9b9e196400f72884bb15e30662d8bb1ed9c8b2a0c43eb9"} Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.395636 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d8bf-account-create-update-f5twm" 
event={"ID":"5d173a21-f567-430e-b7fe-c47892f42873","Type":"ContainerStarted","Data":"12bc5f25f4fab7bd424b7ecd41040a348ea7f36cbdbd74a785bd9242cc9e9b44"} Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.395687 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d8bf-account-create-update-f5twm" event={"ID":"5d173a21-f567-430e-b7fe-c47892f42873","Type":"ContainerStarted","Data":"afa9c4903d114baa7f0e14f33bfb7a909473163b6343417829dc344644e8b4a9"} Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.405954 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-rnnwz" podStartSLOduration=4.40592798 podStartE2EDuration="4.40592798s" podCreationTimestamp="2025-12-11 13:27:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:32.358923808 +0000 UTC m=+1409.931250245" watchObservedRunningTime="2025-12-11 13:27:32.40592798 +0000 UTC m=+1409.978254417" Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.411491 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.930897261 podStartE2EDuration="15.41147312s" podCreationTimestamp="2025-12-11 13:27:17 +0000 UTC" firstStartedPulling="2025-12-11 13:27:18.45240484 +0000 UTC m=+1396.024731277" lastFinishedPulling="2025-12-11 13:27:30.932980699 +0000 UTC m=+1408.505307136" observedRunningTime="2025-12-11 13:27:32.38373775 +0000 UTC m=+1409.956064187" watchObservedRunningTime="2025-12-11 13:27:32.41147312 +0000 UTC m=+1409.983799557" Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.436544 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-d8bf-account-create-update-f5twm" podStartSLOduration=4.436527779 podStartE2EDuration="4.436527779s" podCreationTimestamp="2025-12-11 13:27:28 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:32.41332131 +0000 UTC m=+1409.985647747" watchObservedRunningTime="2025-12-11 13:27:32.436527779 +0000 UTC m=+1410.008854216" Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.771715 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.804052 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee3c0532-b7dd-4147-812a-3add6446d9a1" path="/var/lib/kubelet/pods/ee3c0532-b7dd-4147-812a-3add6446d9a1/volumes" Dec 11 13:27:32 crc kubenswrapper[4898]: W1211 13:27:32.840814 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25ac34fb_287b_4ea4_b349_124a57354453.slice/crio-9b5e0d61001119ad8fe8068cad317e7699f8b04e7c6a00e55c83241d6f90265a WatchSource:0}: Error finding container 9b5e0d61001119ad8fe8068cad317e7699f8b04e7c6a00e55c83241d6f90265a: Status 404 returned error can't find the container with id 9b5e0d61001119ad8fe8068cad317e7699f8b04e7c6a00e55c83241d6f90265a Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.959299 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.959615 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8c0762b4-f3a4-4243-8df8-94e805983b4b" containerName="glance-log" containerID="cri-o://3d10166a76dcef4f62b5c7bc9c12ea67476d6be9d8622c008144c0f8e7c00080" gracePeriod=30 Dec 11 13:27:32 crc kubenswrapper[4898]: I1211 13:27:32.959971 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8c0762b4-f3a4-4243-8df8-94e805983b4b" containerName="glance-httpd" 
containerID="cri-o://2c7613a2c4c0a54040bfa09053d6ae624afab17f5a34bdbb4122d42d97ca4e33" gracePeriod=30 Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.409993 4898 generic.go:334] "Generic (PLEG): container finished" podID="8c0762b4-f3a4-4243-8df8-94e805983b4b" containerID="3d10166a76dcef4f62b5c7bc9c12ea67476d6be9d8622c008144c0f8e7c00080" exitCode=143 Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.411151 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c0762b4-f3a4-4243-8df8-94e805983b4b","Type":"ContainerDied","Data":"3d10166a76dcef4f62b5c7bc9c12ea67476d6be9d8622c008144c0f8e7c00080"} Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.422700 4898 generic.go:334] "Generic (PLEG): container finished" podID="446e3383-1a1b-4271-94ca-1662e36059d3" containerID="dc81a7b7c44e4ed19357ecb521a7ccadf35a545b8d09718313ff58bd79adce59" exitCode=0 Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.422961 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rnnwz" event={"ID":"446e3383-1a1b-4271-94ca-1662e36059d3","Type":"ContainerDied","Data":"dc81a7b7c44e4ed19357ecb521a7ccadf35a545b8d09718313ff58bd79adce59"} Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.432961 4898 generic.go:334] "Generic (PLEG): container finished" podID="c2cfe6b8-8f34-455c-afe3-61b33605d648" containerID="22c0ea90004f2589fdd305567b0d291a7c8b35733c827f0fd4d110900c546170" exitCode=0 Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.433033 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1a34-account-create-update-g82b9" event={"ID":"c2cfe6b8-8f34-455c-afe3-61b33605d648","Type":"ContainerDied","Data":"22c0ea90004f2589fdd305567b0d291a7c8b35733c827f0fd4d110900c546170"} Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.443629 4898 generic.go:334] "Generic (PLEG): container finished" podID="2b4db1d0-fd46-4731-8392-e8390610a2c8" 
containerID="e4c171d2cc66e164716bb30d991949d4fbde33237792a70a6a026fe33cf71144" exitCode=0 Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.444239 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tfn2j" event={"ID":"2b4db1d0-fd46-4731-8392-e8390610a2c8","Type":"ContainerDied","Data":"e4c171d2cc66e164716bb30d991949d4fbde33237792a70a6a026fe33cf71144"} Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.454796 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.455629 4898 generic.go:334] "Generic (PLEG): container finished" podID="5d173a21-f567-430e-b7fe-c47892f42873" containerID="12bc5f25f4fab7bd424b7ecd41040a348ea7f36cbdbd74a785bd9242cc9e9b44" exitCode=0 Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.455728 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d8bf-account-create-update-f5twm" event={"ID":"5d173a21-f567-430e-b7fe-c47892f42873","Type":"ContainerDied","Data":"12bc5f25f4fab7bd424b7ecd41040a348ea7f36cbdbd74a785bd9242cc9e9b44"} Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.465888 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25ac34fb-287b-4ea4-b349-124a57354453","Type":"ContainerStarted","Data":"9b5e0d61001119ad8fe8068cad317e7699f8b04e7c6a00e55c83241d6f90265a"} Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.473911 4898 generic.go:334] "Generic (PLEG): container finished" podID="f07f8aaa-f41a-4204-881e-274dc9a9ad74" containerID="9f62647d653af839ebc11879be82bcfc9ac18bc6d9e6ce2f25d2671618157129" exitCode=0 Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.474153 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qnzh8" 
event={"ID":"f07f8aaa-f41a-4204-881e-274dc9a9ad74","Type":"ContainerDied","Data":"9f62647d653af839ebc11879be82bcfc9ac18bc6d9e6ce2f25d2671618157129"} Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.475891 4898 generic.go:334] "Generic (PLEG): container finished" podID="b9c2941b-6d82-4218-8cc7-91f6473d0fbf" containerID="58e185896e058998d630d9bdb5ec15963cbaba34c173a769cba3bf6bcd8a32b3" exitCode=137 Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.475989 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9c2941b-6d82-4218-8cc7-91f6473d0fbf","Type":"ContainerDied","Data":"58e185896e058998d630d9bdb5ec15963cbaba34c173a769cba3bf6bcd8a32b3"} Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.476057 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9c2941b-6d82-4218-8cc7-91f6473d0fbf","Type":"ContainerDied","Data":"3d04a7dc70631fda6e3d0eaecd889ae56d25494dfc1f1a4a1dc8fbc35a12d5e0"} Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.476131 4898 scope.go:117] "RemoveContainer" containerID="58e185896e058998d630d9bdb5ec15963cbaba34c173a769cba3bf6bcd8a32b3" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.476286 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.480226 4898 generic.go:334] "Generic (PLEG): container finished" podID="1188efd5-cf9c-48dc-bd72-5259294cca4c" containerID="31418266dbda0e303b0b1289c6ae139d3a6995a48d451fcd08d3ae4def89749b" exitCode=0 Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.480392 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c008-account-create-update-5t4pl" event={"ID":"1188efd5-cf9c-48dc-bd72-5259294cca4c","Type":"ContainerDied","Data":"31418266dbda0e303b0b1289c6ae139d3a6995a48d451fcd08d3ae4def89749b"} Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.481371 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-config-data-custom\") pod \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.483000 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brzs6\" (UniqueName: \"kubernetes.io/projected/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-kube-api-access-brzs6\") pod \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.483079 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-config-data\") pod \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.483250 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-scripts\") pod \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\" 
(UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.483287 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-combined-ca-bundle\") pod \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.483382 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-etc-machine-id\") pod \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.483444 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-logs\") pod \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\" (UID: \"b9c2941b-6d82-4218-8cc7-91f6473d0fbf\") " Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.485163 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-logs" (OuterVolumeSpecName: "logs") pod "b9c2941b-6d82-4218-8cc7-91f6473d0fbf" (UID: "b9c2941b-6d82-4218-8cc7-91f6473d0fbf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.490210 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b9c2941b-6d82-4218-8cc7-91f6473d0fbf" (UID: "b9c2941b-6d82-4218-8cc7-91f6473d0fbf"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.491197 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b9c2941b-6d82-4218-8cc7-91f6473d0fbf" (UID: "b9c2941b-6d82-4218-8cc7-91f6473d0fbf"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.492540 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-scripts" (OuterVolumeSpecName: "scripts") pod "b9c2941b-6d82-4218-8cc7-91f6473d0fbf" (UID: "b9c2941b-6d82-4218-8cc7-91f6473d0fbf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.492966 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-kube-api-access-brzs6" (OuterVolumeSpecName: "kube-api-access-brzs6") pod "b9c2941b-6d82-4218-8cc7-91f6473d0fbf" (UID: "b9c2941b-6d82-4218-8cc7-91f6473d0fbf"). InnerVolumeSpecName "kube-api-access-brzs6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.535109 4898 scope.go:117] "RemoveContainer" containerID="4d0ffac38ead83f4b73e7c0742be67db7ebbd09cc27e6dadd523be21c77db4e9" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.590450 4898 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.604933 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-logs\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.604968 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.604991 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brzs6\" (UniqueName: \"kubernetes.io/projected/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-kube-api-access-brzs6\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.605005 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.621528 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9c2941b-6d82-4218-8cc7-91f6473d0fbf" (UID: "b9c2941b-6d82-4218-8cc7-91f6473d0fbf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.621905 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-config-data" (OuterVolumeSpecName: "config-data") pod "b9c2941b-6d82-4218-8cc7-91f6473d0fbf" (UID: "b9c2941b-6d82-4218-8cc7-91f6473d0fbf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.628195 4898 scope.go:117] "RemoveContainer" containerID="58e185896e058998d630d9bdb5ec15963cbaba34c173a769cba3bf6bcd8a32b3" Dec 11 13:27:33 crc kubenswrapper[4898]: E1211 13:27:33.631655 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58e185896e058998d630d9bdb5ec15963cbaba34c173a769cba3bf6bcd8a32b3\": container with ID starting with 58e185896e058998d630d9bdb5ec15963cbaba34c173a769cba3bf6bcd8a32b3 not found: ID does not exist" containerID="58e185896e058998d630d9bdb5ec15963cbaba34c173a769cba3bf6bcd8a32b3" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.631722 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58e185896e058998d630d9bdb5ec15963cbaba34c173a769cba3bf6bcd8a32b3"} err="failed to get container status \"58e185896e058998d630d9bdb5ec15963cbaba34c173a769cba3bf6bcd8a32b3\": rpc error: code = NotFound desc = could not find container \"58e185896e058998d630d9bdb5ec15963cbaba34c173a769cba3bf6bcd8a32b3\": container with ID starting with 58e185896e058998d630d9bdb5ec15963cbaba34c173a769cba3bf6bcd8a32b3 not found: ID does not exist" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.631778 4898 scope.go:117] "RemoveContainer" containerID="4d0ffac38ead83f4b73e7c0742be67db7ebbd09cc27e6dadd523be21c77db4e9" Dec 11 13:27:33 crc kubenswrapper[4898]: E1211 13:27:33.635218 4898 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d0ffac38ead83f4b73e7c0742be67db7ebbd09cc27e6dadd523be21c77db4e9\": container with ID starting with 4d0ffac38ead83f4b73e7c0742be67db7ebbd09cc27e6dadd523be21c77db4e9 not found: ID does not exist" containerID="4d0ffac38ead83f4b73e7c0742be67db7ebbd09cc27e6dadd523be21c77db4e9" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.635258 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0ffac38ead83f4b73e7c0742be67db7ebbd09cc27e6dadd523be21c77db4e9"} err="failed to get container status \"4d0ffac38ead83f4b73e7c0742be67db7ebbd09cc27e6dadd523be21c77db4e9\": rpc error: code = NotFound desc = could not find container \"4d0ffac38ead83f4b73e7c0742be67db7ebbd09cc27e6dadd523be21c77db4e9\": container with ID starting with 4d0ffac38ead83f4b73e7c0742be67db7ebbd09cc27e6dadd523be21c77db4e9 not found: ID does not exist" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.712672 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.712704 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9c2941b-6d82-4218-8cc7-91f6473d0fbf-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.956296 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-596c56fc48-gpcvd"] Dec 11 13:27:33 crc kubenswrapper[4898]: E1211 13:27:33.957957 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9c2941b-6d82-4218-8cc7-91f6473d0fbf" containerName="cinder-api-log" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.977213 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b9c2941b-6d82-4218-8cc7-91f6473d0fbf" containerName="cinder-api-log" Dec 11 13:27:33 crc kubenswrapper[4898]: E1211 13:27:33.977286 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9c2941b-6d82-4218-8cc7-91f6473d0fbf" containerName="cinder-api" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.977293 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9c2941b-6d82-4218-8cc7-91f6473d0fbf" containerName="cinder-api" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.977753 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9c2941b-6d82-4218-8cc7-91f6473d0fbf" containerName="cinder-api-log" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.977778 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9c2941b-6d82-4218-8cc7-91f6473d0fbf" containerName="cinder-api" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.979073 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.989424 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-nrsz5" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.989656 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 11 13:27:33 crc kubenswrapper[4898]: I1211 13:27:33.989775 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.036955 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-596c56fc48-gpcvd"] Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.042999 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4lb9\" (UniqueName: 
\"kubernetes.io/projected/eff78caa-c237-4040-b508-7cd9b8b5413f-kube-api-access-w4lb9\") pod \"heat-engine-596c56fc48-gpcvd\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.043130 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-combined-ca-bundle\") pod \"heat-engine-596c56fc48-gpcvd\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.043165 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-config-data\") pod \"heat-engine-596c56fc48-gpcvd\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.045732 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-config-data-custom\") pod \"heat-engine-596c56fc48-gpcvd\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.087844 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.117745 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.152499 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6cfcb9bcc9-8j4sz"] Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.161626 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-config-data\") pod \"heat-engine-596c56fc48-gpcvd\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.161812 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-config-data-custom\") pod \"heat-engine-596c56fc48-gpcvd\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.161937 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.162141 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4lb9\" (UniqueName: \"kubernetes.io/projected/eff78caa-c237-4040-b508-7cd9b8b5413f-kube-api-access-w4lb9\") pod \"heat-engine-596c56fc48-gpcvd\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.162252 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-combined-ca-bundle\") pod \"heat-engine-596c56fc48-gpcvd\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.172223 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.185839 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-config-data\") pod \"heat-engine-596c56fc48-gpcvd\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.185871 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-config-data-custom\") pod \"heat-engine-596c56fc48-gpcvd\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.188443 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6ff897fbd7-rddq7"] Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.195793 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-combined-ca-bundle\") pod \"heat-engine-596c56fc48-gpcvd\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.201219 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.204449 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.222777 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-m8v8g"] Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.230705 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.238245 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4lb9\" (UniqueName: \"kubernetes.io/projected/eff78caa-c237-4040-b508-7cd9b8b5413f-kube-api-access-w4lb9\") pod \"heat-engine-596c56fc48-gpcvd\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.267963 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-config-data\") pod \"heat-api-6ff897fbd7-rddq7\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.268018 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-config-data\") pod \"heat-cfnapi-6cfcb9bcc9-8j4sz\" (UID: \"0ee1613f-2749-4158-b826-8096758ca04f\") " pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.268059 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-config-data-custom\") pod \"heat-api-6ff897fbd7-rddq7\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.268083 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg7zt\" (UniqueName: \"kubernetes.io/projected/0ee1613f-2749-4158-b826-8096758ca04f-kube-api-access-bg7zt\") pod \"heat-cfnapi-6cfcb9bcc9-8j4sz\" (UID: 
\"0ee1613f-2749-4158-b826-8096758ca04f\") " pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.268110 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-combined-ca-bundle\") pod \"heat-api-6ff897fbd7-rddq7\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.268137 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-combined-ca-bundle\") pod \"heat-cfnapi-6cfcb9bcc9-8j4sz\" (UID: \"0ee1613f-2749-4158-b826-8096758ca04f\") " pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.268174 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-config-data-custom\") pod \"heat-cfnapi-6cfcb9bcc9-8j4sz\" (UID: \"0ee1613f-2749-4158-b826-8096758ca04f\") " pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.268200 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bjms\" (UniqueName: \"kubernetes.io/projected/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-kube-api-access-5bjms\") pod \"heat-api-6ff897fbd7-rddq7\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.309110 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.311252 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.313124 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.313516 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.313660 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.322739 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.327587 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6cfcb9bcc9-8j4sz"] Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.342646 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6ff897fbd7-rddq7"] Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.371659 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.372937 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373007 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 
crc kubenswrapper[4898]: I1211 13:27:34.373040 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-scripts\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373079 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373121 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373153 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf7zh\" (UniqueName: \"kubernetes.io/projected/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-kube-api-access-hf7zh\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373174 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5v8v\" (UniqueName: \"kubernetes.io/projected/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-kube-api-access-f5v8v\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: 
I1211 13:27:34.373206 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-config-data\") pod \"heat-api-6ff897fbd7-rddq7\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373226 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-config-data\") pod \"heat-cfnapi-6cfcb9bcc9-8j4sz\" (UID: \"0ee1613f-2749-4158-b826-8096758ca04f\") " pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373290 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-config\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373312 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373353 4898 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-config-data-custom\") pod \"heat-api-6ff897fbd7-rddq7\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373382 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-config-data\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373410 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg7zt\" (UniqueName: \"kubernetes.io/projected/0ee1613f-2749-4158-b826-8096758ca04f-kube-api-access-bg7zt\") pod \"heat-cfnapi-6cfcb9bcc9-8j4sz\" (UID: \"0ee1613f-2749-4158-b826-8096758ca04f\") " pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373438 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373503 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-combined-ca-bundle\") pod \"heat-api-6ff897fbd7-rddq7\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373542 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-combined-ca-bundle\") pod \"heat-cfnapi-6cfcb9bcc9-8j4sz\" (UID: \"0ee1613f-2749-4158-b826-8096758ca04f\") " pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373564 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373623 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-config-data-custom\") pod \"heat-cfnapi-6cfcb9bcc9-8j4sz\" (UID: \"0ee1613f-2749-4158-b826-8096758ca04f\") " pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373661 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373691 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bjms\" (UniqueName: \"kubernetes.io/projected/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-kube-api-access-5bjms\") pod \"heat-api-6ff897fbd7-rddq7\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.373714 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-logs\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.386427 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-m8v8g"] Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.405974 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-config-data-custom\") pod \"heat-cfnapi-6cfcb9bcc9-8j4sz\" (UID: \"0ee1613f-2749-4158-b826-8096758ca04f\") " pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.417146 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bjms\" (UniqueName: \"kubernetes.io/projected/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-kube-api-access-5bjms\") pod \"heat-api-6ff897fbd7-rddq7\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.423738 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-combined-ca-bundle\") pod \"heat-cfnapi-6cfcb9bcc9-8j4sz\" (UID: \"0ee1613f-2749-4158-b826-8096758ca04f\") " pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.427670 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-combined-ca-bundle\") pod \"heat-api-6ff897fbd7-rddq7\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.428790 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-config-data\") pod \"heat-api-6ff897fbd7-rddq7\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.429047 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg7zt\" (UniqueName: \"kubernetes.io/projected/0ee1613f-2749-4158-b826-8096758ca04f-kube-api-access-bg7zt\") pod \"heat-cfnapi-6cfcb9bcc9-8j4sz\" (UID: \"0ee1613f-2749-4158-b826-8096758ca04f\") " pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.429320 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-config-data-custom\") pod \"heat-api-6ff897fbd7-rddq7\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.431553 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-config-data\") pod \"heat-cfnapi-6cfcb9bcc9-8j4sz\" (UID: \"0ee1613f-2749-4158-b826-8096758ca04f\") " pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.455424 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.475291 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.475351 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-config-data\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.475393 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.475450 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.475551 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.475573 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-logs\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.475618 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.475660 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.475681 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-scripts\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.475703 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.475737 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-dns-swift-storage-0\") pod 
\"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.475760 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf7zh\" (UniqueName: \"kubernetes.io/projected/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-kube-api-access-hf7zh\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.475779 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5v8v\" (UniqueName: \"kubernetes.io/projected/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-kube-api-access-f5v8v\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.475811 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.475850 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-config\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.476821 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-config\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " 
pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.484808 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.485905 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.487385 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.494407 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.494643 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-logs\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.495488 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.495533 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.497395 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-config-data\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.504964 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-scripts\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.513228 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.525629 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5v8v\" (UniqueName: \"kubernetes.io/projected/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-kube-api-access-f5v8v\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 
13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.526528 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.532443 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2\") " pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.532665 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf7zh\" (UniqueName: \"kubernetes.io/projected/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-kube-api-access-hf7zh\") pod \"dnsmasq-dns-688b9f5b49-m8v8g\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.577808 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.709216 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.808746 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9c2941b-6d82-4218-8cc7-91f6473d0fbf" path="/var/lib/kubelet/pods/b9c2941b-6d82-4218-8cc7-91f6473d0fbf/volumes" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.814759 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:34 crc kubenswrapper[4898]: I1211 13:27:34.990465 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-596c56fc48-gpcvd"] Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.001725 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.001776 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.420648 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tfn2j" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.449960 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rnnwz" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.495947 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qnzh8" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.558971 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/446e3383-1a1b-4271-94ca-1662e36059d3-operator-scripts\") pod \"446e3383-1a1b-4271-94ca-1662e36059d3\" (UID: \"446e3383-1a1b-4271-94ca-1662e36059d3\") " Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.559346 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4db1d0-fd46-4731-8392-e8390610a2c8-operator-scripts\") pod \"2b4db1d0-fd46-4731-8392-e8390610a2c8\" (UID: \"2b4db1d0-fd46-4731-8392-e8390610a2c8\") " Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.559491 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7ckh\" (UniqueName: \"kubernetes.io/projected/446e3383-1a1b-4271-94ca-1662e36059d3-kube-api-access-f7ckh\") pod \"446e3383-1a1b-4271-94ca-1662e36059d3\" (UID: \"446e3383-1a1b-4271-94ca-1662e36059d3\") " Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.560279 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g6lj\" (UniqueName: \"kubernetes.io/projected/2b4db1d0-fd46-4731-8392-e8390610a2c8-kube-api-access-8g6lj\") pod \"2b4db1d0-fd46-4731-8392-e8390610a2c8\" (UID: \"2b4db1d0-fd46-4731-8392-e8390610a2c8\") " Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.561086 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/446e3383-1a1b-4271-94ca-1662e36059d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "446e3383-1a1b-4271-94ca-1662e36059d3" (UID: "446e3383-1a1b-4271-94ca-1662e36059d3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.561719 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b4db1d0-fd46-4731-8392-e8390610a2c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b4db1d0-fd46-4731-8392-e8390610a2c8" (UID: "2b4db1d0-fd46-4731-8392-e8390610a2c8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:35 crc kubenswrapper[4898]: E1211 13:27:35.564654 4898 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.565367 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/446e3383-1a1b-4271-94ca-1662e36059d3-kube-api-access-f7ckh" (OuterVolumeSpecName: "kube-api-access-f7ckh") pod "446e3383-1a1b-4271-94ca-1662e36059d3" (UID: "446e3383-1a1b-4271-94ca-1662e36059d3"). InnerVolumeSpecName "kube-api-access-f7ckh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.566495 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b4db1d0-fd46-4731-8392-e8390610a2c8-kube-api-access-8g6lj" (OuterVolumeSpecName: "kube-api-access-8g6lj") pod "2b4db1d0-fd46-4731-8392-e8390610a2c8" (UID: "2b4db1d0-fd46-4731-8392-e8390610a2c8"). InnerVolumeSpecName "kube-api-access-8g6lj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.602335 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/446e3383-1a1b-4271-94ca-1662e36059d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.602374 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b4db1d0-fd46-4731-8392-e8390610a2c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.602403 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7ckh\" (UniqueName: \"kubernetes.io/projected/446e3383-1a1b-4271-94ca-1662e36059d3-kube-api-access-f7ckh\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.602418 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g6lj\" (UniqueName: \"kubernetes.io/projected/2b4db1d0-fd46-4731-8392-e8390610a2c8-kube-api-access-8g6lj\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.677483 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-596c56fc48-gpcvd" event={"ID":"eff78caa-c237-4040-b508-7cd9b8b5413f","Type":"ContainerStarted","Data":"fc709e30f1c7b9a4dfc26b17afc9659745d99b9f0f5fc75d869e217db7fd7218"} Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.677526 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-596c56fc48-gpcvd" event={"ID":"eff78caa-c237-4040-b508-7cd9b8b5413f","Type":"ContainerStarted","Data":"c9837276bd3680657621b6b1a7f9c0e6b336657e3d6e02311e1af86c999327c8"} Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.678816 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:35 crc 
kubenswrapper[4898]: I1211 13:27:35.703143 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25ac34fb-287b-4ea4-b349-124a57354453","Type":"ContainerStarted","Data":"6f15206f068cd564bdcfebb3a774f04b52319bb6114731e417b7e5baec2291c0"} Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.721913 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-596c56fc48-gpcvd" podStartSLOduration=2.7218676889999998 podStartE2EDuration="2.721867689s" podCreationTimestamp="2025-12-11 13:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:35.707253523 +0000 UTC m=+1413.279579980" watchObservedRunningTime="2025-12-11 13:27:35.721867689 +0000 UTC m=+1413.294194126" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.727778 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rnnwz" event={"ID":"446e3383-1a1b-4271-94ca-1662e36059d3","Type":"ContainerDied","Data":"4b84ce924720483b4a391454c702ec7df2e5424b4cb5403360134b2ded4dcf3f"} Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.727804 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rnnwz" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.727818 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b84ce924720483b4a391454c702ec7df2e5424b4cb5403360134b2ded4dcf3f" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.728100 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f07f8aaa-f41a-4204-881e-274dc9a9ad74-operator-scripts\") pod \"f07f8aaa-f41a-4204-881e-274dc9a9ad74\" (UID: \"f07f8aaa-f41a-4204-881e-274dc9a9ad74\") " Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.728617 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpdhh\" (UniqueName: \"kubernetes.io/projected/f07f8aaa-f41a-4204-881e-274dc9a9ad74-kube-api-access-hpdhh\") pod \"f07f8aaa-f41a-4204-881e-274dc9a9ad74\" (UID: \"f07f8aaa-f41a-4204-881e-274dc9a9ad74\") " Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.741361 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f07f8aaa-f41a-4204-881e-274dc9a9ad74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f07f8aaa-f41a-4204-881e-274dc9a9ad74" (UID: "f07f8aaa-f41a-4204-881e-274dc9a9ad74"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.750549 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f07f8aaa-f41a-4204-881e-274dc9a9ad74-kube-api-access-hpdhh" (OuterVolumeSpecName: "kube-api-access-hpdhh") pod "f07f8aaa-f41a-4204-881e-274dc9a9ad74" (UID: "f07f8aaa-f41a-4204-881e-274dc9a9ad74"). InnerVolumeSpecName "kube-api-access-hpdhh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.774205 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tfn2j" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.775139 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tfn2j" event={"ID":"2b4db1d0-fd46-4731-8392-e8390610a2c8","Type":"ContainerDied","Data":"e3a5f372e315923c258908ab647d85c0446d0626fc70e1c640cef616e34a4fca"} Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.775174 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3a5f372e315923c258908ab647d85c0446d0626fc70e1c640cef616e34a4fca" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.779632 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1a34-account-create-update-g82b9" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.786300 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qnzh8" event={"ID":"f07f8aaa-f41a-4204-881e-274dc9a9ad74","Type":"ContainerDied","Data":"0912ea9b60aaf356fd9b9e196400f72884bb15e30662d8bb1ed9c8b2a0c43eb9"} Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.786342 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0912ea9b60aaf356fd9b9e196400f72884bb15e30662d8bb1ed9c8b2a0c43eb9" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.786393 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qnzh8" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.786609 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d8bf-account-create-update-f5twm" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.819227 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c008-account-create-update-5t4pl" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.832269 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm798\" (UniqueName: \"kubernetes.io/projected/c2cfe6b8-8f34-455c-afe3-61b33605d648-kube-api-access-wm798\") pod \"c2cfe6b8-8f34-455c-afe3-61b33605d648\" (UID: \"c2cfe6b8-8f34-455c-afe3-61b33605d648\") " Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.832440 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kp4x\" (UniqueName: \"kubernetes.io/projected/5d173a21-f567-430e-b7fe-c47892f42873-kube-api-access-9kp4x\") pod \"5d173a21-f567-430e-b7fe-c47892f42873\" (UID: \"5d173a21-f567-430e-b7fe-c47892f42873\") " Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.832678 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d173a21-f567-430e-b7fe-c47892f42873-operator-scripts\") pod \"5d173a21-f567-430e-b7fe-c47892f42873\" (UID: \"5d173a21-f567-430e-b7fe-c47892f42873\") " Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.832753 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2cfe6b8-8f34-455c-afe3-61b33605d648-operator-scripts\") pod \"c2cfe6b8-8f34-455c-afe3-61b33605d648\" (UID: \"c2cfe6b8-8f34-455c-afe3-61b33605d648\") " Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.833332 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f07f8aaa-f41a-4204-881e-274dc9a9ad74-operator-scripts\") on node \"crc\" 
DevicePath \"\"" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.833348 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpdhh\" (UniqueName: \"kubernetes.io/projected/f07f8aaa-f41a-4204-881e-274dc9a9ad74-kube-api-access-hpdhh\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.836134 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d173a21-f567-430e-b7fe-c47892f42873-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d173a21-f567-430e-b7fe-c47892f42873" (UID: "5d173a21-f567-430e-b7fe-c47892f42873"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.836537 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2cfe6b8-8f34-455c-afe3-61b33605d648-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2cfe6b8-8f34-455c-afe3-61b33605d648" (UID: "c2cfe6b8-8f34-455c-afe3-61b33605d648"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.843620 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d173a21-f567-430e-b7fe-c47892f42873-kube-api-access-9kp4x" (OuterVolumeSpecName: "kube-api-access-9kp4x") pod "5d173a21-f567-430e-b7fe-c47892f42873" (UID: "5d173a21-f567-430e-b7fe-c47892f42873"). InnerVolumeSpecName "kube-api-access-9kp4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.862227 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2cfe6b8-8f34-455c-afe3-61b33605d648-kube-api-access-wm798" (OuterVolumeSpecName: "kube-api-access-wm798") pod "c2cfe6b8-8f34-455c-afe3-61b33605d648" (UID: "c2cfe6b8-8f34-455c-afe3-61b33605d648"). InnerVolumeSpecName "kube-api-access-wm798". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.934348 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1188efd5-cf9c-48dc-bd72-5259294cca4c-operator-scripts\") pod \"1188efd5-cf9c-48dc-bd72-5259294cca4c\" (UID: \"1188efd5-cf9c-48dc-bd72-5259294cca4c\") " Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.934477 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56gdv\" (UniqueName: \"kubernetes.io/projected/1188efd5-cf9c-48dc-bd72-5259294cca4c-kube-api-access-56gdv\") pod \"1188efd5-cf9c-48dc-bd72-5259294cca4c\" (UID: \"1188efd5-cf9c-48dc-bd72-5259294cca4c\") " Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.934826 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1188efd5-cf9c-48dc-bd72-5259294cca4c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1188efd5-cf9c-48dc-bd72-5259294cca4c" (UID: "1188efd5-cf9c-48dc-bd72-5259294cca4c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.935518 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d173a21-f567-430e-b7fe-c47892f42873-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.935536 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2cfe6b8-8f34-455c-afe3-61b33605d648-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.935545 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1188efd5-cf9c-48dc-bd72-5259294cca4c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.935554 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm798\" (UniqueName: \"kubernetes.io/projected/c2cfe6b8-8f34-455c-afe3-61b33605d648-kube-api-access-wm798\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.935566 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kp4x\" (UniqueName: \"kubernetes.io/projected/5d173a21-f567-430e-b7fe-c47892f42873-kube-api-access-9kp4x\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:35 crc kubenswrapper[4898]: I1211 13:27:35.951729 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1188efd5-cf9c-48dc-bd72-5259294cca4c-kube-api-access-56gdv" (OuterVolumeSpecName: "kube-api-access-56gdv") pod "1188efd5-cf9c-48dc-bd72-5259294cca4c" (UID: "1188efd5-cf9c-48dc-bd72-5259294cca4c"). InnerVolumeSpecName "kube-api-access-56gdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:36 crc kubenswrapper[4898]: I1211 13:27:36.039775 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56gdv\" (UniqueName: \"kubernetes.io/projected/1188efd5-cf9c-48dc-bd72-5259294cca4c-kube-api-access-56gdv\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:36 crc kubenswrapper[4898]: I1211 13:27:36.270082 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="8c0762b4-f3a4-4243-8df8-94e805983b4b" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.187:9292/healthcheck\": read tcp 10.217.0.2:55008->10.217.0.187:9292: read: connection reset by peer" Dec 11 13:27:36 crc kubenswrapper[4898]: I1211 13:27:36.271014 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="8c0762b4-f3a4-4243-8df8-94e805983b4b" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.187:9292/healthcheck\": read tcp 10.217.0.2:54996->10.217.0.187:9292: read: connection reset by peer" Dec 11 13:27:36 crc kubenswrapper[4898]: I1211 13:27:36.271057 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 11 13:27:36 crc kubenswrapper[4898]: I1211 13:27:36.294640 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-m8v8g"] Dec 11 13:27:36 crc kubenswrapper[4898]: I1211 13:27:36.337297 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6ff897fbd7-rddq7"] Dec 11 13:27:36 crc kubenswrapper[4898]: I1211 13:27:36.362162 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6cfcb9bcc9-8j4sz"] Dec 11 13:27:36 crc kubenswrapper[4898]: I1211 13:27:36.894390 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" 
event={"ID":"3ec65fb5-ed03-4c86-b3da-19711d02ef4d","Type":"ContainerStarted","Data":"6bfc24cc35b3b19439986ff3d40b2a88a2d588ca1324395af0c81f8e7ed41cd1"} Dec 11 13:27:36 crc kubenswrapper[4898]: I1211 13:27:36.907697 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1a34-account-create-update-g82b9" event={"ID":"c2cfe6b8-8f34-455c-afe3-61b33605d648","Type":"ContainerDied","Data":"98ce40f66b62878aa8237e75a4d371d148e987874778b63456e174e6e291908a"} Dec 11 13:27:36 crc kubenswrapper[4898]: I1211 13:27:36.907760 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98ce40f66b62878aa8237e75a4d371d148e987874778b63456e174e6e291908a" Dec 11 13:27:36 crc kubenswrapper[4898]: I1211 13:27:36.907873 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1a34-account-create-update-g82b9" Dec 11 13:27:36 crc kubenswrapper[4898]: I1211 13:27:36.918759 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6ff897fbd7-rddq7" event={"ID":"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee","Type":"ContainerStarted","Data":"cebf79a6c94cfbbe95f3601b078191aea3ed4c0f4aa59ba5c70f314c998dfed2"} Dec 11 13:27:36 crc kubenswrapper[4898]: I1211 13:27:36.971691 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25ac34fb-287b-4ea4-b349-124a57354453","Type":"ContainerStarted","Data":"28921edda652003320dab5988a7617717d4874a26d9de9ca3ec96c8cb7818cb9"} Dec 11 13:27:36 crc kubenswrapper[4898]: I1211 13:27:36.998200 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2","Type":"ContainerStarted","Data":"84282bd5541fc9cc9e0f74eadb3e04a1137d1af44281c52d8d0a5a85f906608d"} Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.070667 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" 
event={"ID":"0ee1613f-2749-4158-b826-8096758ca04f","Type":"ContainerStarted","Data":"259bffd5dee25aedcda33245332fe75a51051ac1bdb36b148371210739ad4e1e"} Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.083132 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c008-account-create-update-5t4pl" event={"ID":"1188efd5-cf9c-48dc-bd72-5259294cca4c","Type":"ContainerDied","Data":"17086baff6937e1f0965e1d496ca3bbaaa1672cba22bffa7301d0a87f6ffc017"} Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.083168 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17086baff6937e1f0965e1d496ca3bbaaa1672cba22bffa7301d0a87f6ffc017" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.083224 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c008-account-create-update-5t4pl" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.101184 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d8bf-account-create-update-f5twm" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.101784 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d8bf-account-create-update-f5twm" event={"ID":"5d173a21-f567-430e-b7fe-c47892f42873","Type":"ContainerDied","Data":"afa9c4903d114baa7f0e14f33bfb7a909473163b6343417829dc344644e8b4a9"} Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.101821 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afa9c4903d114baa7f0e14f33bfb7a909473163b6343417829dc344644e8b4a9" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.117619 4898 generic.go:334] "Generic (PLEG): container finished" podID="8c0762b4-f3a4-4243-8df8-94e805983b4b" containerID="2c7613a2c4c0a54040bfa09053d6ae624afab17f5a34bdbb4122d42d97ca4e33" exitCode=0 Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.118523 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c0762b4-f3a4-4243-8df8-94e805983b4b","Type":"ContainerDied","Data":"2c7613a2c4c0a54040bfa09053d6ae624afab17f5a34bdbb4122d42d97ca4e33"} Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.260938 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.340151 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ftmf\" (UniqueName: \"kubernetes.io/projected/8c0762b4-f3a4-4243-8df8-94e805983b4b-kube-api-access-2ftmf\") pod \"8c0762b4-f3a4-4243-8df8-94e805983b4b\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.340258 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-scripts\") pod \"8c0762b4-f3a4-4243-8df8-94e805983b4b\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.340415 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"8c0762b4-f3a4-4243-8df8-94e805983b4b\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.340432 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c0762b4-f3a4-4243-8df8-94e805983b4b-logs\") pod \"8c0762b4-f3a4-4243-8df8-94e805983b4b\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.340488 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-config-data\") pod \"8c0762b4-f3a4-4243-8df8-94e805983b4b\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.340618 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/8c0762b4-f3a4-4243-8df8-94e805983b4b-httpd-run\") pod \"8c0762b4-f3a4-4243-8df8-94e805983b4b\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.340653 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-combined-ca-bundle\") pod \"8c0762b4-f3a4-4243-8df8-94e805983b4b\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.340671 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-public-tls-certs\") pod \"8c0762b4-f3a4-4243-8df8-94e805983b4b\" (UID: \"8c0762b4-f3a4-4243-8df8-94e805983b4b\") " Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.342618 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c0762b4-f3a4-4243-8df8-94e805983b4b-logs" (OuterVolumeSpecName: "logs") pod "8c0762b4-f3a4-4243-8df8-94e805983b4b" (UID: "8c0762b4-f3a4-4243-8df8-94e805983b4b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.348543 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c0762b4-f3a4-4243-8df8-94e805983b4b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8c0762b4-f3a4-4243-8df8-94e805983b4b" (UID: "8c0762b4-f3a4-4243-8df8-94e805983b4b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.350366 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-scripts" (OuterVolumeSpecName: "scripts") pod "8c0762b4-f3a4-4243-8df8-94e805983b4b" (UID: "8c0762b4-f3a4-4243-8df8-94e805983b4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.353000 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "8c0762b4-f3a4-4243-8df8-94e805983b4b" (UID: "8c0762b4-f3a4-4243-8df8-94e805983b4b"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.357533 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c0762b4-f3a4-4243-8df8-94e805983b4b-kube-api-access-2ftmf" (OuterVolumeSpecName: "kube-api-access-2ftmf") pod "8c0762b4-f3a4-4243-8df8-94e805983b4b" (UID: "8c0762b4-f3a4-4243-8df8-94e805983b4b"). InnerVolumeSpecName "kube-api-access-2ftmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.446615 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c0762b4-f3a4-4243-8df8-94e805983b4b-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.446652 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ftmf\" (UniqueName: \"kubernetes.io/projected/8c0762b4-f3a4-4243-8df8-94e805983b4b-kube-api-access-2ftmf\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.446664 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.446685 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.446693 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c0762b4-f3a4-4243-8df8-94e805983b4b-logs\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.495945 4898 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.500650 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8c0762b4-f3a4-4243-8df8-94e805983b4b" (UID: "8c0762b4-f3a4-4243-8df8-94e805983b4b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.514053 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c0762b4-f3a4-4243-8df8-94e805983b4b" (UID: "8c0762b4-f3a4-4243-8df8-94e805983b4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.549495 4898 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.549528 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.549555 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.558856 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-config-data" (OuterVolumeSpecName: "config-data") pod "8c0762b4-f3a4-4243-8df8-94e805983b4b" (UID: "8c0762b4-f3a4-4243-8df8-94e805983b4b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:37 crc kubenswrapper[4898]: I1211 13:27:37.651319 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c0762b4-f3a4-4243-8df8-94e805983b4b-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.134118 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c0762b4-f3a4-4243-8df8-94e805983b4b","Type":"ContainerDied","Data":"7c133537feb4776bbe88c6d81babfb82052c9ecd8a33e41fd80f7c195ecf417c"} Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.134493 4898 scope.go:117] "RemoveContainer" containerID="2c7613a2c4c0a54040bfa09053d6ae624afab17f5a34bdbb4122d42d97ca4e33" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.134366 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.152014 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25ac34fb-287b-4ea4-b349-124a57354453","Type":"ContainerStarted","Data":"374a78de8739eef4fa406721b9dca1ec6b141ead76b48ecae8837b884187cc46"} Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.155036 4898 generic.go:334] "Generic (PLEG): container finished" podID="3ec65fb5-ed03-4c86-b3da-19711d02ef4d" containerID="3a1f98064dff3f8d2587ac4ebb0c37042beb56a5d07d4f78149ee4ec7c5a6a46" exitCode=0 Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.155113 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" event={"ID":"3ec65fb5-ed03-4c86-b3da-19711d02ef4d","Type":"ContainerDied","Data":"3a1f98064dff3f8d2587ac4ebb0c37042beb56a5d07d4f78149ee4ec7c5a6a46"} Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.159640 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2","Type":"ContainerStarted","Data":"18b4da2ee9ed8a88a855feefdb755e376140d58b6e97d4c9ee7777a6b614bdb9"} Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.199688 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.217163 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.247899 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 13:27:38 crc kubenswrapper[4898]: E1211 13:27:38.248554 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c0762b4-f3a4-4243-8df8-94e805983b4b" containerName="glance-log" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.248578 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c0762b4-f3a4-4243-8df8-94e805983b4b" containerName="glance-log" Dec 11 13:27:38 crc kubenswrapper[4898]: E1211 13:27:38.248601 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1188efd5-cf9c-48dc-bd72-5259294cca4c" containerName="mariadb-account-create-update" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.248609 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1188efd5-cf9c-48dc-bd72-5259294cca4c" containerName="mariadb-account-create-update" Dec 11 13:27:38 crc kubenswrapper[4898]: E1211 13:27:38.248625 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c0762b4-f3a4-4243-8df8-94e805983b4b" containerName="glance-httpd" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.248631 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c0762b4-f3a4-4243-8df8-94e805983b4b" containerName="glance-httpd" Dec 11 13:27:38 crc kubenswrapper[4898]: E1211 13:27:38.248680 4898 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="2b4db1d0-fd46-4731-8392-e8390610a2c8" containerName="mariadb-database-create" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.248689 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4db1d0-fd46-4731-8392-e8390610a2c8" containerName="mariadb-database-create" Dec 11 13:27:38 crc kubenswrapper[4898]: E1211 13:27:38.248701 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07f8aaa-f41a-4204-881e-274dc9a9ad74" containerName="mariadb-database-create" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.248709 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07f8aaa-f41a-4204-881e-274dc9a9ad74" containerName="mariadb-database-create" Dec 11 13:27:38 crc kubenswrapper[4898]: E1211 13:27:38.248722 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cfe6b8-8f34-455c-afe3-61b33605d648" containerName="mariadb-account-create-update" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.248730 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cfe6b8-8f34-455c-afe3-61b33605d648" containerName="mariadb-account-create-update" Dec 11 13:27:38 crc kubenswrapper[4898]: E1211 13:27:38.248743 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d173a21-f567-430e-b7fe-c47892f42873" containerName="mariadb-account-create-update" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.248754 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d173a21-f567-430e-b7fe-c47892f42873" containerName="mariadb-account-create-update" Dec 11 13:27:38 crc kubenswrapper[4898]: E1211 13:27:38.248765 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446e3383-1a1b-4271-94ca-1662e36059d3" containerName="mariadb-database-create" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.248773 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="446e3383-1a1b-4271-94ca-1662e36059d3" containerName="mariadb-database-create" Dec 11 13:27:38 crc 
kubenswrapper[4898]: I1211 13:27:38.249046 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1188efd5-cf9c-48dc-bd72-5259294cca4c" containerName="mariadb-account-create-update" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.249069 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="446e3383-1a1b-4271-94ca-1662e36059d3" containerName="mariadb-database-create" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.249081 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b4db1d0-fd46-4731-8392-e8390610a2c8" containerName="mariadb-database-create" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.249092 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2cfe6b8-8f34-455c-afe3-61b33605d648" containerName="mariadb-account-create-update" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.249105 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c0762b4-f3a4-4243-8df8-94e805983b4b" containerName="glance-httpd" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.249119 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d173a21-f567-430e-b7fe-c47892f42873" containerName="mariadb-account-create-update" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.249133 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c0762b4-f3a4-4243-8df8-94e805983b4b" containerName="glance-log" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.249147 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f07f8aaa-f41a-4204-881e-274dc9a9ad74" containerName="mariadb-database-create" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.252065 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.254016 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.255345 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.270114 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.368670 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72c623d3-596d-4db8-8447-3bc93f7187e4-logs\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.368739 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c623d3-596d-4db8-8447-3bc93f7187e4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.368781 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c623d3-596d-4db8-8447-3bc93f7187e4-config-data\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.368827 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxr2f\" (UniqueName: 
\"kubernetes.io/projected/72c623d3-596d-4db8-8447-3bc93f7187e4-kube-api-access-lxr2f\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.369057 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.369214 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72c623d3-596d-4db8-8447-3bc93f7187e4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.369280 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/72c623d3-596d-4db8-8447-3bc93f7187e4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.369674 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72c623d3-596d-4db8-8447-3bc93f7187e4-scripts\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.472689 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/72c623d3-596d-4db8-8447-3bc93f7187e4-scripts\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.472817 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72c623d3-596d-4db8-8447-3bc93f7187e4-logs\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.472837 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c623d3-596d-4db8-8447-3bc93f7187e4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.473354 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72c623d3-596d-4db8-8447-3bc93f7187e4-logs\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.473413 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c623d3-596d-4db8-8447-3bc93f7187e4-config-data\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.473909 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxr2f\" (UniqueName: \"kubernetes.io/projected/72c623d3-596d-4db8-8447-3bc93f7187e4-kube-api-access-lxr2f\") pod 
\"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.474626 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.474760 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72c623d3-596d-4db8-8447-3bc93f7187e4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.474818 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/72c623d3-596d-4db8-8447-3bc93f7187e4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.475279 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/72c623d3-596d-4db8-8447-3bc93f7187e4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.475540 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") device mount path 
\"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.496227 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c623d3-596d-4db8-8447-3bc93f7187e4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.500042 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72c623d3-596d-4db8-8447-3bc93f7187e4-scripts\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.500647 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c623d3-596d-4db8-8447-3bc93f7187e4-config-data\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.501317 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72c623d3-596d-4db8-8447-3bc93f7187e4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.504605 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxr2f\" (UniqueName: \"kubernetes.io/projected/72c623d3-596d-4db8-8447-3bc93f7187e4-kube-api-access-lxr2f\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc 
kubenswrapper[4898]: I1211 13:27:38.562087 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"72c623d3-596d-4db8-8447-3bc93f7187e4\") " pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.600514 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.750163 4898 scope.go:117] "RemoveContainer" containerID="3d10166a76dcef4f62b5c7bc9c12ea67476d6be9d8622c008144c0f8e7c00080" Dec 11 13:27:38 crc kubenswrapper[4898]: I1211 13:27:38.812589 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c0762b4-f3a4-4243-8df8-94e805983b4b" path="/var/lib/kubelet/pods/8c0762b4-f3a4-4243-8df8-94e805983b4b/volumes" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.219928 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2","Type":"ContainerStarted","Data":"e454105a88f0dca69ff8dd5897f78fc6fb0aaf3cfd94b425822fd8b8d08dcd57"} Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.220566 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.276182 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.276159298 podStartE2EDuration="6.276159298s" podCreationTimestamp="2025-12-11 13:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:39.254742468 +0000 UTC m=+1416.827068925" watchObservedRunningTime="2025-12-11 13:27:39.276159298 +0000 UTC m=+1416.848485735" Dec 11 13:27:39 crc 
kubenswrapper[4898]: I1211 13:27:39.608053 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rpgj6"] Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.611973 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.619831 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2kdzw" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.645495 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.645654 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.687564 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rpgj6"] Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.715892 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-784v4\" (UniqueName: \"kubernetes.io/projected/421ec273-9526-4100-9a5a-63e0512beee3-kube-api-access-784v4\") pod \"nova-cell0-conductor-db-sync-rpgj6\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.716004 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-config-data\") pod \"nova-cell0-conductor-db-sync-rpgj6\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.716037 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-scripts\") pod \"nova-cell0-conductor-db-sync-rpgj6\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.716130 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rpgj6\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.790486 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.853974 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rpgj6\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.854396 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-784v4\" (UniqueName: \"kubernetes.io/projected/421ec273-9526-4100-9a5a-63e0512beee3-kube-api-access-784v4\") pod \"nova-cell0-conductor-db-sync-rpgj6\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.854500 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-config-data\") pod 
\"nova-cell0-conductor-db-sync-rpgj6\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.854527 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-scripts\") pod \"nova-cell0-conductor-db-sync-rpgj6\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.868781 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.868998 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1dc1c258-0a10-41a3-a831-0b8b1878ac80" containerName="glance-log" containerID="cri-o://f443b849f68ba83870d8cd9bb8cb6e0ee28dc53333c98ea0253e563f39842ea5" gracePeriod=30 Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.869692 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1dc1c258-0a10-41a3-a831-0b8b1878ac80" containerName="glance-httpd" containerID="cri-o://8238f160349bb9bfa3897baa7615ac6ec775c9f9dafb1f1c69ff40c49db69db6" gracePeriod=30 Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.871148 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rpgj6\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.871745 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-scripts\") pod \"nova-cell0-conductor-db-sync-rpgj6\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.872126 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-config-data\") pod \"nova-cell0-conductor-db-sync-rpgj6\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:27:39 crc kubenswrapper[4898]: I1211 13:27:39.914538 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-784v4\" (UniqueName: \"kubernetes.io/projected/421ec273-9526-4100-9a5a-63e0512beee3-kube-api-access-784v4\") pod \"nova-cell0-conductor-db-sync-rpgj6\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:27:40 crc kubenswrapper[4898]: I1211 13:27:40.029948 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:27:40 crc kubenswrapper[4898]: I1211 13:27:40.341565 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" event={"ID":"3ec65fb5-ed03-4c86-b3da-19711d02ef4d","Type":"ContainerStarted","Data":"34268be809bb953ad05606192e502b89eee8a121c7980b49b6c8fc38f9cb3a0b"} Dec 11 13:27:40 crc kubenswrapper[4898]: I1211 13:27:40.341889 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:40 crc kubenswrapper[4898]: I1211 13:27:40.385073 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" event={"ID":"0ee1613f-2749-4158-b826-8096758ca04f","Type":"ContainerStarted","Data":"3e4d687a0921db9498e4173ffb0d78fb822d8dfbfe1a84dcabbeaaaaa578664a"} Dec 11 13:27:40 crc kubenswrapper[4898]: I1211 13:27:40.385593 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:40 crc kubenswrapper[4898]: I1211 13:27:40.417007 4898 generic.go:334] "Generic (PLEG): container finished" podID="1dc1c258-0a10-41a3-a831-0b8b1878ac80" containerID="f443b849f68ba83870d8cd9bb8cb6e0ee28dc53333c98ea0253e563f39842ea5" exitCode=143 Dec 11 13:27:40 crc kubenswrapper[4898]: I1211 13:27:40.417084 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1dc1c258-0a10-41a3-a831-0b8b1878ac80","Type":"ContainerDied","Data":"f443b849f68ba83870d8cd9bb8cb6e0ee28dc53333c98ea0253e563f39842ea5"} Dec 11 13:27:40 crc kubenswrapper[4898]: I1211 13:27:40.422260 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" podStartSLOduration=7.422239034 podStartE2EDuration="7.422239034s" podCreationTimestamp="2025-12-11 13:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:40.36298412 +0000 UTC m=+1417.935310557" watchObservedRunningTime="2025-12-11 13:27:40.422239034 +0000 UTC m=+1417.994565471" Dec 11 13:27:40 crc kubenswrapper[4898]: I1211 13:27:40.428794 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"72c623d3-596d-4db8-8447-3bc93f7187e4","Type":"ContainerStarted","Data":"55f5824e02a149a85725d5d3671fab2fc05374f804878509dc0472289828757f"} Dec 11 13:27:40 crc kubenswrapper[4898]: I1211 13:27:40.435687 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" podStartSLOduration=4.878289268 podStartE2EDuration="7.435664257s" podCreationTimestamp="2025-12-11 13:27:33 +0000 UTC" firstStartedPulling="2025-12-11 13:27:36.359788053 +0000 UTC m=+1413.932114490" lastFinishedPulling="2025-12-11 13:27:38.917163042 +0000 UTC m=+1416.489489479" observedRunningTime="2025-12-11 13:27:40.411309558 +0000 UTC m=+1417.983635995" watchObservedRunningTime="2025-12-11 13:27:40.435664257 +0000 UTC m=+1418.007990694" Dec 11 13:27:40 crc kubenswrapper[4898]: I1211 13:27:40.468184 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25ac34fb-287b-4ea4-b349-124a57354453","Type":"ContainerStarted","Data":"66b2a4d4af3a93e795e5f56d227e5414c8e1538bbf1dd1419850642d3f21bf62"} Dec 11 13:27:40 crc kubenswrapper[4898]: I1211 13:27:40.520425 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.417188221 podStartE2EDuration="9.52039726s" podCreationTimestamp="2025-12-11 13:27:31 +0000 UTC" firstStartedPulling="2025-12-11 13:27:32.850216004 +0000 UTC m=+1410.422542431" lastFinishedPulling="2025-12-11 13:27:38.953425033 +0000 UTC m=+1416.525751470" observedRunningTime="2025-12-11 13:27:40.490926523 +0000 UTC m=+1418.063252960" 
watchObservedRunningTime="2025-12-11 13:27:40.52039726 +0000 UTC m=+1418.092723697" Dec 11 13:27:40 crc kubenswrapper[4898]: I1211 13:27:40.692915 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rpgj6"] Dec 11 13:27:41 crc kubenswrapper[4898]: W1211 13:27:41.236648 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod421ec273_9526_4100_9a5a_63e0512beee3.slice/crio-ed982c0a94c488cd357dbbf23c9cb8dc8fca157682e658a06f112e144d99ea85 WatchSource:0}: Error finding container ed982c0a94c488cd357dbbf23c9cb8dc8fca157682e658a06f112e144d99ea85: Status 404 returned error can't find the container with id ed982c0a94c488cd357dbbf23c9cb8dc8fca157682e658a06f112e144d99ea85 Dec 11 13:27:41 crc kubenswrapper[4898]: I1211 13:27:41.502553 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"72c623d3-596d-4db8-8447-3bc93f7187e4","Type":"ContainerStarted","Data":"d9eae93541edaace1ce8b2bb15697a946b68ce3920baa83b3a4ea9164074ab53"} Dec 11 13:27:41 crc kubenswrapper[4898]: I1211 13:27:41.506382 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rpgj6" event={"ID":"421ec273-9526-4100-9a5a-63e0512beee3","Type":"ContainerStarted","Data":"ed982c0a94c488cd357dbbf23c9cb8dc8fca157682e658a06f112e144d99ea85"} Dec 11 13:27:41 crc kubenswrapper[4898]: I1211 13:27:41.507645 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.303553 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7dc95c5f9b-hp64x"] Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.305721 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.315794 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7dc95c5f9b-hp64x"] Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.332877 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5f8cd7c4cf-7zq47"] Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.334494 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.356621 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-58bd476547-b52w6"] Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.359333 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.384668 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5f8cd7c4cf-7zq47"] Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.399524 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-58bd476547-b52w6"] Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.434523 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49vrz\" (UniqueName: \"kubernetes.io/projected/0720809d-a8eb-4edd-a201-67305a32bc97-kube-api-access-49vrz\") pod \"heat-cfnapi-7dc95c5f9b-hp64x\" (UID: \"0720809d-a8eb-4edd-a201-67305a32bc97\") " pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.434596 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-config-data\") pod \"heat-cfnapi-7dc95c5f9b-hp64x\" (UID: 
\"0720809d-a8eb-4edd-a201-67305a32bc97\") " pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.438571 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-combined-ca-bundle\") pod \"heat-cfnapi-7dc95c5f9b-hp64x\" (UID: \"0720809d-a8eb-4edd-a201-67305a32bc97\") " pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.439688 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-config-data-custom\") pod \"heat-cfnapi-7dc95c5f9b-hp64x\" (UID: \"0720809d-a8eb-4edd-a201-67305a32bc97\") " pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.535001 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6ff897fbd7-rddq7" event={"ID":"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee","Type":"ContainerStarted","Data":"75320a8b39bb7fa3543db9e0288c17f254ec4ffaa105274e968f0d723be9cc21"} Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.536349 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.539080 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"72c623d3-596d-4db8-8447-3bc93f7187e4","Type":"ContainerStarted","Data":"457343090b71614e6b3e60249d42f82bf6323dd30338e5219c077be55b0ec3ae"} Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.541288 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp44m\" (UniqueName: 
\"kubernetes.io/projected/2caa9e06-4684-4011-8b55-e4436bf09d20-kube-api-access-kp44m\") pod \"heat-api-58bd476547-b52w6\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.541336 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-combined-ca-bundle\") pod \"heat-cfnapi-7dc95c5f9b-hp64x\" (UID: \"0720809d-a8eb-4edd-a201-67305a32bc97\") " pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.541360 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-config-data\") pod \"heat-api-58bd476547-b52w6\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.541390 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-combined-ca-bundle\") pod \"heat-engine-5f8cd7c4cf-7zq47\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.541667 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-config-data\") pod \"heat-engine-5f8cd7c4cf-7zq47\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.541976 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-config-data-custom\") pod \"heat-engine-5f8cd7c4cf-7zq47\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.542076 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-config-data-custom\") pod \"heat-cfnapi-7dc95c5f9b-hp64x\" (UID: \"0720809d-a8eb-4edd-a201-67305a32bc97\") " pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.542109 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv5cw\" (UniqueName: \"kubernetes.io/projected/e113da52-055b-47d0-b16a-2fef9d302bbe-kube-api-access-bv5cw\") pod \"heat-engine-5f8cd7c4cf-7zq47\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.542169 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49vrz\" (UniqueName: \"kubernetes.io/projected/0720809d-a8eb-4edd-a201-67305a32bc97-kube-api-access-49vrz\") pod \"heat-cfnapi-7dc95c5f9b-hp64x\" (UID: \"0720809d-a8eb-4edd-a201-67305a32bc97\") " pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.542192 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-combined-ca-bundle\") pod \"heat-api-58bd476547-b52w6\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.542230 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-config-data\") pod \"heat-cfnapi-7dc95c5f9b-hp64x\" (UID: \"0720809d-a8eb-4edd-a201-67305a32bc97\") " pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.542339 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-config-data-custom\") pod \"heat-api-58bd476547-b52w6\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.565764 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-config-data\") pod \"heat-cfnapi-7dc95c5f9b-hp64x\" (UID: \"0720809d-a8eb-4edd-a201-67305a32bc97\") " pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.568621 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-config-data-custom\") pod \"heat-cfnapi-7dc95c5f9b-hp64x\" (UID: \"0720809d-a8eb-4edd-a201-67305a32bc97\") " pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.569428 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-combined-ca-bundle\") pod \"heat-cfnapi-7dc95c5f9b-hp64x\" (UID: \"0720809d-a8eb-4edd-a201-67305a32bc97\") " pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.572151 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49vrz\" (UniqueName: 
\"kubernetes.io/projected/0720809d-a8eb-4edd-a201-67305a32bc97-kube-api-access-49vrz\") pod \"heat-cfnapi-7dc95c5f9b-hp64x\" (UID: \"0720809d-a8eb-4edd-a201-67305a32bc97\") " pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.587081 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6ff897fbd7-rddq7" podStartSLOduration=4.621718164 podStartE2EDuration="9.58705974s" podCreationTimestamp="2025-12-11 13:27:33 +0000 UTC" firstStartedPulling="2025-12-11 13:27:36.359394752 +0000 UTC m=+1413.931721189" lastFinishedPulling="2025-12-11 13:27:41.324736328 +0000 UTC m=+1418.897062765" observedRunningTime="2025-12-11 13:27:42.550332176 +0000 UTC m=+1420.122658623" watchObservedRunningTime="2025-12-11 13:27:42.58705974 +0000 UTC m=+1420.159386177" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.625957 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.626989 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.62696784 podStartE2EDuration="4.62696784s" podCreationTimestamp="2025-12-11 13:27:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:42.619303512 +0000 UTC m=+1420.191629949" watchObservedRunningTime="2025-12-11 13:27:42.62696784 +0000 UTC m=+1420.199294277" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.643860 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-combined-ca-bundle\") pod \"heat-api-58bd476547-b52w6\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:42 
crc kubenswrapper[4898]: I1211 13:27:42.643985 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-config-data-custom\") pod \"heat-api-58bd476547-b52w6\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.644095 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp44m\" (UniqueName: \"kubernetes.io/projected/2caa9e06-4684-4011-8b55-e4436bf09d20-kube-api-access-kp44m\") pod \"heat-api-58bd476547-b52w6\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.644117 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-config-data\") pod \"heat-api-58bd476547-b52w6\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.644144 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-combined-ca-bundle\") pod \"heat-engine-5f8cd7c4cf-7zq47\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.644168 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-config-data\") pod \"heat-engine-5f8cd7c4cf-7zq47\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.644323 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-config-data-custom\") pod \"heat-engine-5f8cd7c4cf-7zq47\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.644442 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv5cw\" (UniqueName: \"kubernetes.io/projected/e113da52-055b-47d0-b16a-2fef9d302bbe-kube-api-access-bv5cw\") pod \"heat-engine-5f8cd7c4cf-7zq47\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.649427 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-config-data\") pod \"heat-engine-5f8cd7c4cf-7zq47\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.650166 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-config-data-custom\") pod \"heat-api-58bd476547-b52w6\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.654867 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-config-data\") pod \"heat-api-58bd476547-b52w6\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.659314 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-combined-ca-bundle\") pod \"heat-engine-5f8cd7c4cf-7zq47\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.660118 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-config-data-custom\") pod \"heat-engine-5f8cd7c4cf-7zq47\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.660706 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-combined-ca-bundle\") pod \"heat-api-58bd476547-b52w6\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.667112 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv5cw\" (UniqueName: \"kubernetes.io/projected/e113da52-055b-47d0-b16a-2fef9d302bbe-kube-api-access-bv5cw\") pod \"heat-engine-5f8cd7c4cf-7zq47\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.670368 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp44m\" (UniqueName: \"kubernetes.io/projected/2caa9e06-4684-4011-8b55-e4436bf09d20-kube-api-access-kp44m\") pod \"heat-api-58bd476547-b52w6\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.685720 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:27:42 crc kubenswrapper[4898]: I1211 13:27:42.695538 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.385239 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7dc95c5f9b-hp64x"] Dec 11 13:27:43 crc kubenswrapper[4898]: W1211 13:27:43.386580 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0720809d_a8eb_4edd_a201_67305a32bc97.slice/crio-c7744e9ebb929645ee9eb9d00537c73ba5dde4990782b9793eb6e6304c5f9a56 WatchSource:0}: Error finding container c7744e9ebb929645ee9eb9d00537c73ba5dde4990782b9793eb6e6304c5f9a56: Status 404 returned error can't find the container with id c7744e9ebb929645ee9eb9d00537c73ba5dde4990782b9793eb6e6304c5f9a56 Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.535201 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-58bd476547-b52w6"] Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.588100 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" event={"ID":"0720809d-a8eb-4edd-a201-67305a32bc97","Type":"ContainerStarted","Data":"c7744e9ebb929645ee9eb9d00537c73ba5dde4990782b9793eb6e6304c5f9a56"} Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.611837 4898 generic.go:334] "Generic (PLEG): container finished" podID="1dc1c258-0a10-41a3-a831-0b8b1878ac80" containerID="8238f160349bb9bfa3897baa7615ac6ec775c9f9dafb1f1c69ff40c49db69db6" exitCode=0 Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.620378 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"1dc1c258-0a10-41a3-a831-0b8b1878ac80","Type":"ContainerDied","Data":"8238f160349bb9bfa3897baa7615ac6ec775c9f9dafb1f1c69ff40c49db69db6"} Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.634651 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5f8cd7c4cf-7zq47"] Dec 11 13:27:43 crc kubenswrapper[4898]: W1211 13:27:43.699152 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode113da52_055b_47d0_b16a_2fef9d302bbe.slice/crio-a16387aef9d3e60ec85510eb7d40372515839f75829594ea56628444149f9b18 WatchSource:0}: Error finding container a16387aef9d3e60ec85510eb7d40372515839f75829594ea56628444149f9b18: Status 404 returned error can't find the container with id a16387aef9d3e60ec85510eb7d40372515839f75829594ea56628444149f9b18 Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.832138 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.900329 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-scripts\") pod \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.900422 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-combined-ca-bundle\") pod \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.900484 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wbfk\" (UniqueName: 
\"kubernetes.io/projected/1dc1c258-0a10-41a3-a831-0b8b1878ac80-kube-api-access-4wbfk\") pod \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.900612 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-internal-tls-certs\") pod \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.900664 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.900726 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dc1c258-0a10-41a3-a831-0b8b1878ac80-logs\") pod \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.900759 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1dc1c258-0a10-41a3-a831-0b8b1878ac80-httpd-run\") pod \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.900864 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-config-data\") pod \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\" (UID: \"1dc1c258-0a10-41a3-a831-0b8b1878ac80\") " Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.904682 4898 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dc1c258-0a10-41a3-a831-0b8b1878ac80-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1dc1c258-0a10-41a3-a831-0b8b1878ac80" (UID: "1dc1c258-0a10-41a3-a831-0b8b1878ac80"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.905016 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dc1c258-0a10-41a3-a831-0b8b1878ac80-logs" (OuterVolumeSpecName: "logs") pod "1dc1c258-0a10-41a3-a831-0b8b1878ac80" (UID: "1dc1c258-0a10-41a3-a831-0b8b1878ac80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.908976 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "1dc1c258-0a10-41a3-a831-0b8b1878ac80" (UID: "1dc1c258-0a10-41a3-a831-0b8b1878ac80"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.915650 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-scripts" (OuterVolumeSpecName: "scripts") pod "1dc1c258-0a10-41a3-a831-0b8b1878ac80" (UID: "1dc1c258-0a10-41a3-a831-0b8b1878ac80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.925716 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dc1c258-0a10-41a3-a831-0b8b1878ac80-kube-api-access-4wbfk" (OuterVolumeSpecName: "kube-api-access-4wbfk") pod "1dc1c258-0a10-41a3-a831-0b8b1878ac80" (UID: "1dc1c258-0a10-41a3-a831-0b8b1878ac80"). InnerVolumeSpecName "kube-api-access-4wbfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:43 crc kubenswrapper[4898]: I1211 13:27:43.986348 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dc1c258-0a10-41a3-a831-0b8b1878ac80" (UID: "1dc1c258-0a10-41a3-a831-0b8b1878ac80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.005239 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wbfk\" (UniqueName: \"kubernetes.io/projected/1dc1c258-0a10-41a3-a831-0b8b1878ac80-kube-api-access-4wbfk\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.005292 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.005306 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dc1c258-0a10-41a3-a831-0b8b1878ac80-logs\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.005318 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1dc1c258-0a10-41a3-a831-0b8b1878ac80-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.005328 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.005340 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.062943 4898 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.084775 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1dc1c258-0a10-41a3-a831-0b8b1878ac80" (UID: "1dc1c258-0a10-41a3-a831-0b8b1878ac80"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.111594 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.111632 4898 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.149004 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-config-data" (OuterVolumeSpecName: "config-data") pod "1dc1c258-0a10-41a3-a831-0b8b1878ac80" (UID: "1dc1c258-0a10-41a3-a831-0b8b1878ac80"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.213953 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dc1c258-0a10-41a3-a831-0b8b1878ac80-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:44 crc kubenswrapper[4898]: E1211 13:27:44.435506 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0720809d_a8eb_4edd_a201_67305a32bc97.slice/crio-conmon-74c171cb2674a163c6ff5d0d77adaa5e3fa5d3d9fa1fb24ad701211573939c52.scope\": RecentStats: unable to find data in memory cache]" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.640074 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5f8cd7c4cf-7zq47" event={"ID":"e113da52-055b-47d0-b16a-2fef9d302bbe","Type":"ContainerStarted","Data":"4219744614ae19f5e8e2ef87f6e046fe6466914b8739d3d3af7ad27b29bc9fd1"} Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.640517 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5f8cd7c4cf-7zq47" event={"ID":"e113da52-055b-47d0-b16a-2fef9d302bbe","Type":"ContainerStarted","Data":"a16387aef9d3e60ec85510eb7d40372515839f75829594ea56628444149f9b18"} Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.642143 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.646664 4898 generic.go:334] "Generic (PLEG): container finished" podID="2caa9e06-4684-4011-8b55-e4436bf09d20" containerID="980d299249a8b3b2a44b9c94208fcb008905c05c66c7cf4b88905d8c356f76b9" exitCode=1 Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.646729 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58bd476547-b52w6" 
event={"ID":"2caa9e06-4684-4011-8b55-e4436bf09d20","Type":"ContainerDied","Data":"980d299249a8b3b2a44b9c94208fcb008905c05c66c7cf4b88905d8c356f76b9"} Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.646749 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58bd476547-b52w6" event={"ID":"2caa9e06-4684-4011-8b55-e4436bf09d20","Type":"ContainerStarted","Data":"4657e0fe7edfb5a6943ba3fcdebea8595b61d5eb4bfe3084b7a7c0507678e1e0"} Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.647396 4898 scope.go:117] "RemoveContainer" containerID="980d299249a8b3b2a44b9c94208fcb008905c05c66c7cf4b88905d8c356f76b9" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.657883 4898 generic.go:334] "Generic (PLEG): container finished" podID="0720809d-a8eb-4edd-a201-67305a32bc97" containerID="74c171cb2674a163c6ff5d0d77adaa5e3fa5d3d9fa1fb24ad701211573939c52" exitCode=1 Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.657977 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" event={"ID":"0720809d-a8eb-4edd-a201-67305a32bc97","Type":"ContainerDied","Data":"74c171cb2674a163c6ff5d0d77adaa5e3fa5d3d9fa1fb24ad701211573939c52"} Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.658771 4898 scope.go:117] "RemoveContainer" containerID="74c171cb2674a163c6ff5d0d77adaa5e3fa5d3d9fa1fb24ad701211573939c52" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.662197 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1dc1c258-0a10-41a3-a831-0b8b1878ac80","Type":"ContainerDied","Data":"a0d23ae69b81b124946d9c11e911d16785ba6ad8e44ade7b74a1d3aebbdc4bdf"} Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.662251 4898 scope.go:117] "RemoveContainer" containerID="8238f160349bb9bfa3897baa7615ac6ec775c9f9dafb1f1c69ff40c49db69db6" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.662494 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.675262 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5f8cd7c4cf-7zq47" podStartSLOduration=2.675241212 podStartE2EDuration="2.675241212s" podCreationTimestamp="2025-12-11 13:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:44.660855503 +0000 UTC m=+1422.233181940" watchObservedRunningTime="2025-12-11 13:27:44.675241212 +0000 UTC m=+1422.247567649" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.796677 4898 scope.go:117] "RemoveContainer" containerID="f443b849f68ba83870d8cd9bb8cb6e0ee28dc53333c98ea0253e563f39842ea5" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.842898 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.842942 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.842964 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.859236 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 13:27:44 crc kubenswrapper[4898]: E1211 13:27:44.859901 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc1c258-0a10-41a3-a831-0b8b1878ac80" containerName="glance-log" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.859937 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc1c258-0a10-41a3-a831-0b8b1878ac80" containerName="glance-log" Dec 11 13:27:44 crc kubenswrapper[4898]: E1211 13:27:44.861691 4898 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="1dc1c258-0a10-41a3-a831-0b8b1878ac80" containerName="glance-httpd" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.861777 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc1c258-0a10-41a3-a831-0b8b1878ac80" containerName="glance-httpd" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.864131 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc1c258-0a10-41a3-a831-0b8b1878ac80" containerName="glance-httpd" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.864172 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc1c258-0a10-41a3-a831-0b8b1878ac80" containerName="glance-log" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.865572 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.868064 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.868342 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.882447 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.940117 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a98434b-9a01-4f9f-a0c0-5c52ab613405-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.940226 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5a98434b-9a01-4f9f-a0c0-5c52ab613405-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.940338 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a98434b-9a01-4f9f-a0c0-5c52ab613405-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.940446 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.945228 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a98434b-9a01-4f9f-a0c0-5c52ab613405-logs\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.945361 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a98434b-9a01-4f9f-a0c0-5c52ab613405-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.945547 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/5a98434b-9a01-4f9f-a0c0-5c52ab613405-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.945876 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4kt7\" (UniqueName: \"kubernetes.io/projected/5a98434b-9a01-4f9f-a0c0-5c52ab613405-kube-api-access-j4kt7\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.947630 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-cx55d"] Dec 11 13:27:44 crc kubenswrapper[4898]: I1211 13:27:44.947874 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-cx55d" podUID="5bdd1d6a-439c-487d-be8f-59ddafe284de" containerName="dnsmasq-dns" containerID="cri-o://0e86e7b3d2d0a5a5a6c3c1a89921a8ade3c0656cd7567f7d30efb5d3154c4586" gracePeriod=10 Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.047648 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4kt7\" (UniqueName: \"kubernetes.io/projected/5a98434b-9a01-4f9f-a0c0-5c52ab613405-kube-api-access-j4kt7\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.047708 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a98434b-9a01-4f9f-a0c0-5c52ab613405-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: 
I1211 13:27:45.047741 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a98434b-9a01-4f9f-a0c0-5c52ab613405-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.047783 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a98434b-9a01-4f9f-a0c0-5c52ab613405-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.047833 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.047888 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a98434b-9a01-4f9f-a0c0-5c52ab613405-logs\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.047927 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a98434b-9a01-4f9f-a0c0-5c52ab613405-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.047965 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5a98434b-9a01-4f9f-a0c0-5c52ab613405-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.048381 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5a98434b-9a01-4f9f-a0c0-5c52ab613405-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.050804 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.052881 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a98434b-9a01-4f9f-a0c0-5c52ab613405-logs\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.055379 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a98434b-9a01-4f9f-a0c0-5c52ab613405-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.058000 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a98434b-9a01-4f9f-a0c0-5c52ab613405-scripts\") 
pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.075968 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a98434b-9a01-4f9f-a0c0-5c52ab613405-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.083769 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a98434b-9a01-4f9f-a0c0-5c52ab613405-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.085503 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4kt7\" (UniqueName: \"kubernetes.io/projected/5a98434b-9a01-4f9f-a0c0-5c52ab613405-kube-api-access-j4kt7\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.136131 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"5a98434b-9a01-4f9f-a0c0-5c52ab613405\") " pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.243067 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.611138 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.678126 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-dns-svc\") pod \"5bdd1d6a-439c-487d-be8f-59ddafe284de\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.678821 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-config\") pod \"5bdd1d6a-439c-487d-be8f-59ddafe284de\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.678920 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-dns-swift-storage-0\") pod \"5bdd1d6a-439c-487d-be8f-59ddafe284de\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.678939 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-ovsdbserver-nb\") pod \"5bdd1d6a-439c-487d-be8f-59ddafe284de\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.679040 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p9mr\" (UniqueName: \"kubernetes.io/projected/5bdd1d6a-439c-487d-be8f-59ddafe284de-kube-api-access-9p9mr\") pod \"5bdd1d6a-439c-487d-be8f-59ddafe284de\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.679199 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-ovsdbserver-sb\") pod \"5bdd1d6a-439c-487d-be8f-59ddafe284de\" (UID: \"5bdd1d6a-439c-487d-be8f-59ddafe284de\") " Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.695610 4898 generic.go:334] "Generic (PLEG): container finished" podID="2caa9e06-4684-4011-8b55-e4436bf09d20" containerID="23e2f642ee8186e197f5e9d46b2e12731ff4e5497740cf42cbc2824699a0608e" exitCode=1 Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.695673 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58bd476547-b52w6" event={"ID":"2caa9e06-4684-4011-8b55-e4436bf09d20","Type":"ContainerDied","Data":"23e2f642ee8186e197f5e9d46b2e12731ff4e5497740cf42cbc2824699a0608e"} Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.695709 4898 scope.go:117] "RemoveContainer" containerID="980d299249a8b3b2a44b9c94208fcb008905c05c66c7cf4b88905d8c356f76b9" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.696476 4898 scope.go:117] "RemoveContainer" containerID="23e2f642ee8186e197f5e9d46b2e12731ff4e5497740cf42cbc2824699a0608e" Dec 11 13:27:45 crc kubenswrapper[4898]: E1211 13:27:45.696836 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-58bd476547-b52w6_openstack(2caa9e06-4684-4011-8b55-e4436bf09d20)\"" pod="openstack/heat-api-58bd476547-b52w6" podUID="2caa9e06-4684-4011-8b55-e4436bf09d20" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.698887 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bdd1d6a-439c-487d-be8f-59ddafe284de-kube-api-access-9p9mr" (OuterVolumeSpecName: "kube-api-access-9p9mr") pod "5bdd1d6a-439c-487d-be8f-59ddafe284de" (UID: "5bdd1d6a-439c-487d-be8f-59ddafe284de"). InnerVolumeSpecName "kube-api-access-9p9mr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.729894 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" event={"ID":"0720809d-a8eb-4edd-a201-67305a32bc97","Type":"ContainerStarted","Data":"189e28771eb11157a5b2e0bf0100157d2e510f5ac65180cf81847641cd19ccf7"} Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.730006 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.744563 4898 generic.go:334] "Generic (PLEG): container finished" podID="5bdd1d6a-439c-487d-be8f-59ddafe284de" containerID="0e86e7b3d2d0a5a5a6c3c1a89921a8ade3c0656cd7567f7d30efb5d3154c4586" exitCode=0 Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.744632 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-cx55d" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.744669 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-cx55d" event={"ID":"5bdd1d6a-439c-487d-be8f-59ddafe284de","Type":"ContainerDied","Data":"0e86e7b3d2d0a5a5a6c3c1a89921a8ade3c0656cd7567f7d30efb5d3154c4586"} Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.744723 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-cx55d" event={"ID":"5bdd1d6a-439c-487d-be8f-59ddafe284de","Type":"ContainerDied","Data":"8705cc0b17edc446597efe1056e88bda92c586ab8c6c347c48a38cb9cf5f098f"} Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.771582 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" podStartSLOduration=3.771560021 podStartE2EDuration="3.771560021s" podCreationTimestamp="2025-12-11 13:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:45.74937715 +0000 UTC m=+1423.321703587" watchObservedRunningTime="2025-12-11 13:27:45.771560021 +0000 UTC m=+1423.343886458" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.782272 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p9mr\" (UniqueName: \"kubernetes.io/projected/5bdd1d6a-439c-487d-be8f-59ddafe284de-kube-api-access-9p9mr\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.795701 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5bdd1d6a-439c-487d-be8f-59ddafe284de" (UID: "5bdd1d6a-439c-487d-be8f-59ddafe284de"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.833953 4898 scope.go:117] "RemoveContainer" containerID="0e86e7b3d2d0a5a5a6c3c1a89921a8ade3c0656cd7567f7d30efb5d3154c4586" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.884652 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5bdd1d6a-439c-487d-be8f-59ddafe284de" (UID: "5bdd1d6a-439c-487d-be8f-59ddafe284de"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.885653 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.885674 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.897768 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-config" (OuterVolumeSpecName: "config") pod "5bdd1d6a-439c-487d-be8f-59ddafe284de" (UID: "5bdd1d6a-439c-487d-be8f-59ddafe284de"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.943317 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5bdd1d6a-439c-487d-be8f-59ddafe284de" (UID: "5bdd1d6a-439c-487d-be8f-59ddafe284de"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.950715 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5bdd1d6a-439c-487d-be8f-59ddafe284de" (UID: "5bdd1d6a-439c-487d-be8f-59ddafe284de"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.982916 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.983551 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25ac34fb-287b-4ea4-b349-124a57354453" containerName="ceilometer-central-agent" containerID="cri-o://6f15206f068cd564bdcfebb3a774f04b52319bb6114731e417b7e5baec2291c0" gracePeriod=30 Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.984108 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25ac34fb-287b-4ea4-b349-124a57354453" containerName="proxy-httpd" containerID="cri-o://66b2a4d4af3a93e795e5f56d227e5414c8e1538bbf1dd1419850642d3f21bf62" gracePeriod=30 Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.984285 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25ac34fb-287b-4ea4-b349-124a57354453" containerName="sg-core" containerID="cri-o://374a78de8739eef4fa406721b9dca1ec6b141ead76b48ecae8837b884187cc46" gracePeriod=30 Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.984391 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25ac34fb-287b-4ea4-b349-124a57354453" containerName="ceilometer-notification-agent" containerID="cri-o://28921edda652003320dab5988a7617717d4874a26d9de9ca3ec96c8cb7818cb9" gracePeriod=30 Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.987974 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.988125 4898 reconciler_common.go:293] "Volume detached for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:45 crc kubenswrapper[4898]: I1211 13:27:45.988194 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bdd1d6a-439c-487d-be8f-59ddafe284de-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.010904 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.206985 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-cx55d"] Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.212956 4898 scope.go:117] "RemoveContainer" containerID="c2bdcec82c23f76fa53eac0089e53257284830fce4b1da3169165202f0a01c04" Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.237122 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-cx55d"] Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.338436 4898 scope.go:117] "RemoveContainer" containerID="0e86e7b3d2d0a5a5a6c3c1a89921a8ade3c0656cd7567f7d30efb5d3154c4586" Dec 11 13:27:46 crc kubenswrapper[4898]: E1211 13:27:46.346850 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e86e7b3d2d0a5a5a6c3c1a89921a8ade3c0656cd7567f7d30efb5d3154c4586\": container with ID starting with 0e86e7b3d2d0a5a5a6c3c1a89921a8ade3c0656cd7567f7d30efb5d3154c4586 not found: ID does not exist" containerID="0e86e7b3d2d0a5a5a6c3c1a89921a8ade3c0656cd7567f7d30efb5d3154c4586" Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.347288 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e86e7b3d2d0a5a5a6c3c1a89921a8ade3c0656cd7567f7d30efb5d3154c4586"} err="failed to get container 
status \"0e86e7b3d2d0a5a5a6c3c1a89921a8ade3c0656cd7567f7d30efb5d3154c4586\": rpc error: code = NotFound desc = could not find container \"0e86e7b3d2d0a5a5a6c3c1a89921a8ade3c0656cd7567f7d30efb5d3154c4586\": container with ID starting with 0e86e7b3d2d0a5a5a6c3c1a89921a8ade3c0656cd7567f7d30efb5d3154c4586 not found: ID does not exist" Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.347433 4898 scope.go:117] "RemoveContainer" containerID="c2bdcec82c23f76fa53eac0089e53257284830fce4b1da3169165202f0a01c04" Dec 11 13:27:46 crc kubenswrapper[4898]: E1211 13:27:46.349565 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2bdcec82c23f76fa53eac0089e53257284830fce4b1da3169165202f0a01c04\": container with ID starting with c2bdcec82c23f76fa53eac0089e53257284830fce4b1da3169165202f0a01c04 not found: ID does not exist" containerID="c2bdcec82c23f76fa53eac0089e53257284830fce4b1da3169165202f0a01c04" Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.349597 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2bdcec82c23f76fa53eac0089e53257284830fce4b1da3169165202f0a01c04"} err="failed to get container status \"c2bdcec82c23f76fa53eac0089e53257284830fce4b1da3169165202f0a01c04\": rpc error: code = NotFound desc = could not find container \"c2bdcec82c23f76fa53eac0089e53257284830fce4b1da3169165202f0a01c04\": container with ID starting with c2bdcec82c23f76fa53eac0089e53257284830fce4b1da3169165202f0a01c04 not found: ID does not exist" Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.771478 4898 scope.go:117] "RemoveContainer" containerID="23e2f642ee8186e197f5e9d46b2e12731ff4e5497740cf42cbc2824699a0608e" Dec 11 13:27:46 crc kubenswrapper[4898]: E1211 13:27:46.771848 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api 
pod=heat-api-58bd476547-b52w6_openstack(2caa9e06-4684-4011-8b55-e4436bf09d20)\"" pod="openstack/heat-api-58bd476547-b52w6" podUID="2caa9e06-4684-4011-8b55-e4436bf09d20" Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.773351 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5a98434b-9a01-4f9f-a0c0-5c52ab613405","Type":"ContainerStarted","Data":"29bdafac3c7f759cc569d270855001f96a25153bab659bc150485412a5b445fa"} Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.795791 4898 generic.go:334] "Generic (PLEG): container finished" podID="0720809d-a8eb-4edd-a201-67305a32bc97" containerID="189e28771eb11157a5b2e0bf0100157d2e510f5ac65180cf81847641cd19ccf7" exitCode=1 Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.796843 4898 scope.go:117] "RemoveContainer" containerID="189e28771eb11157a5b2e0bf0100157d2e510f5ac65180cf81847641cd19ccf7" Dec 11 13:27:46 crc kubenswrapper[4898]: E1211 13:27:46.797174 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7dc95c5f9b-hp64x_openstack(0720809d-a8eb-4edd-a201-67305a32bc97)\"" pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" podUID="0720809d-a8eb-4edd-a201-67305a32bc97" Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.802890 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dc1c258-0a10-41a3-a831-0b8b1878ac80" path="/var/lib/kubelet/pods/1dc1c258-0a10-41a3-a831-0b8b1878ac80/volumes" Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.804033 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bdd1d6a-439c-487d-be8f-59ddafe284de" path="/var/lib/kubelet/pods/5bdd1d6a-439c-487d-be8f-59ddafe284de/volumes" Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.810298 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="25ac34fb-287b-4ea4-b349-124a57354453" containerID="66b2a4d4af3a93e795e5f56d227e5414c8e1538bbf1dd1419850642d3f21bf62" exitCode=0 Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.810343 4898 generic.go:334] "Generic (PLEG): container finished" podID="25ac34fb-287b-4ea4-b349-124a57354453" containerID="374a78de8739eef4fa406721b9dca1ec6b141ead76b48ecae8837b884187cc46" exitCode=2 Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.813599 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" event={"ID":"0720809d-a8eb-4edd-a201-67305a32bc97","Type":"ContainerDied","Data":"189e28771eb11157a5b2e0bf0100157d2e510f5ac65180cf81847641cd19ccf7"} Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.813651 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25ac34fb-287b-4ea4-b349-124a57354453","Type":"ContainerDied","Data":"66b2a4d4af3a93e795e5f56d227e5414c8e1538bbf1dd1419850642d3f21bf62"} Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.813667 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25ac34fb-287b-4ea4-b349-124a57354453","Type":"ContainerDied","Data":"374a78de8739eef4fa406721b9dca1ec6b141ead76b48ecae8837b884187cc46"} Dec 11 13:27:46 crc kubenswrapper[4898]: I1211 13:27:46.813690 4898 scope.go:117] "RemoveContainer" containerID="74c171cb2674a163c6ff5d0d77adaa5e3fa5d3d9fa1fb24ad701211573939c52" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.016819 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6ff897fbd7-rddq7"] Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.017287 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-6ff897fbd7-rddq7" podUID="c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee" containerName="heat-api" containerID="cri-o://75320a8b39bb7fa3543db9e0288c17f254ec4ffaa105274e968f0d723be9cc21" gracePeriod=60 Dec 11 
13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.036272 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6cfcb9bcc9-8j4sz"] Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.036503 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" podUID="0ee1613f-2749-4158-b826-8096758ca04f" containerName="heat-cfnapi" containerID="cri-o://3e4d687a0921db9498e4173ffb0d78fb822d8dfbfe1a84dcabbeaaaaa578664a" gracePeriod=60 Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.043745 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-6ff897fbd7-rddq7" podUID="c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.215:8004/healthcheck\": EOF" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.060666 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" podUID="0ee1613f-2749-4158-b826-8096758ca04f" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.214:8000/healthcheck\": EOF" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.060849 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" podUID="0ee1613f-2749-4158-b826-8096758ca04f" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.214:8000/healthcheck\": EOF" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.105405 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-59dc5cddb6-qswpf"] Dec 11 13:27:47 crc kubenswrapper[4898]: E1211 13:27:47.106048 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bdd1d6a-439c-487d-be8f-59ddafe284de" containerName="init" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.106077 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bdd1d6a-439c-487d-be8f-59ddafe284de" 
containerName="init" Dec 11 13:27:47 crc kubenswrapper[4898]: E1211 13:27:47.106112 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bdd1d6a-439c-487d-be8f-59ddafe284de" containerName="dnsmasq-dns" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.106122 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bdd1d6a-439c-487d-be8f-59ddafe284de" containerName="dnsmasq-dns" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.106491 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bdd1d6a-439c-487d-be8f-59ddafe284de" containerName="dnsmasq-dns" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.108254 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.115633 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.115902 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.115897 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7f89bf5c7d-t54nx"] Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.117672 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.119166 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.119944 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.126780 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7f89bf5c7d-t54nx"] Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.152033 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-59dc5cddb6-qswpf"] Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.231848 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-internal-tls-certs\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.231945 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-combined-ca-bundle\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.231989 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-public-tls-certs\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc 
kubenswrapper[4898]: I1211 13:27:47.232015 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xptfp\" (UniqueName: \"kubernetes.io/projected/4def8e33-aab9-4c66-8b73-c866ac6c5047-kube-api-access-xptfp\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.232071 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-config-data-custom\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.232113 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-internal-tls-certs\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.232137 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-combined-ca-bundle\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.232167 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-config-data-custom\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " 
pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.232225 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-public-tls-certs\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.232276 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk7ds\" (UniqueName: \"kubernetes.io/projected/84a913ce-6f2e-4327-89b0-eb40be31c03e-kube-api-access-jk7ds\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.232307 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-config-data\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.232483 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-config-data\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.338839 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-internal-tls-certs\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: 
\"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.338923 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-combined-ca-bundle\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.338959 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-public-tls-certs\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.338977 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xptfp\" (UniqueName: \"kubernetes.io/projected/4def8e33-aab9-4c66-8b73-c866ac6c5047-kube-api-access-xptfp\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.339022 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-config-data-custom\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.339055 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-internal-tls-certs\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " 
pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.339076 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-combined-ca-bundle\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.339098 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-config-data-custom\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.339142 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-public-tls-certs\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.339180 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk7ds\" (UniqueName: \"kubernetes.io/projected/84a913ce-6f2e-4327-89b0-eb40be31c03e-kube-api-access-jk7ds\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.339201 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-config-data\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" 
Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.339243 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-config-data\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.343910 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-public-tls-certs\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.347635 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-internal-tls-certs\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.349306 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-combined-ca-bundle\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.351313 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-combined-ca-bundle\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.352056 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-config-data-custom\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.352059 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-config-data-custom\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.352838 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-config-data\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.354002 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-internal-tls-certs\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.362025 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xptfp\" (UniqueName: \"kubernetes.io/projected/4def8e33-aab9-4c66-8b73-c866ac6c5047-kube-api-access-xptfp\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.366996 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-config-data\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.368600 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk7ds\" (UniqueName: \"kubernetes.io/projected/84a913ce-6f2e-4327-89b0-eb40be31c03e-kube-api-access-jk7ds\") pod \"heat-cfnapi-7f89bf5c7d-t54nx\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.378171 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-public-tls-certs\") pod \"heat-api-59dc5cddb6-qswpf\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.490246 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.521447 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.626838 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.699828 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.699877 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.838917 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5a98434b-9a01-4f9f-a0c0-5c52ab613405","Type":"ContainerStarted","Data":"3d97fc3c812a33f84614ce4a53df86b359a2275c905f5538f47a26500305a790"} Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.848388 4898 scope.go:117] "RemoveContainer" containerID="189e28771eb11157a5b2e0bf0100157d2e510f5ac65180cf81847641cd19ccf7" Dec 11 13:27:47 crc kubenswrapper[4898]: E1211 13:27:47.848676 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7dc95c5f9b-hp64x_openstack(0720809d-a8eb-4edd-a201-67305a32bc97)\"" pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" podUID="0720809d-a8eb-4edd-a201-67305a32bc97" Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.874516 4898 generic.go:334] "Generic (PLEG): container finished" podID="25ac34fb-287b-4ea4-b349-124a57354453" containerID="28921edda652003320dab5988a7617717d4874a26d9de9ca3ec96c8cb7818cb9" exitCode=0 Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.874554 4898 generic.go:334] "Generic (PLEG): container finished" podID="25ac34fb-287b-4ea4-b349-124a57354453" 
containerID="6f15206f068cd564bdcfebb3a774f04b52319bb6114731e417b7e5baec2291c0" exitCode=0 Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.875347 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25ac34fb-287b-4ea4-b349-124a57354453","Type":"ContainerDied","Data":"28921edda652003320dab5988a7617717d4874a26d9de9ca3ec96c8cb7818cb9"} Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.875395 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25ac34fb-287b-4ea4-b349-124a57354453","Type":"ContainerDied","Data":"6f15206f068cd564bdcfebb3a774f04b52319bb6114731e417b7e5baec2291c0"} Dec 11 13:27:47 crc kubenswrapper[4898]: I1211 13:27:47.875425 4898 scope.go:117] "RemoveContainer" containerID="23e2f642ee8186e197f5e9d46b2e12731ff4e5497740cf42cbc2824699a0608e" Dec 11 13:27:47 crc kubenswrapper[4898]: E1211 13:27:47.875722 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-58bd476547-b52w6_openstack(2caa9e06-4684-4011-8b55-e4436bf09d20)\"" pod="openstack/heat-api-58bd476547-b52w6" podUID="2caa9e06-4684-4011-8b55-e4436bf09d20" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.219955 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.354729 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7f89bf5c7d-t54nx"] Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.371570 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-59dc5cddb6-qswpf"] Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.582090 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.601661 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.602634 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.687642 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.688393 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkr8x\" (UniqueName: \"kubernetes.io/projected/25ac34fb-287b-4ea4-b349-124a57354453-kube-api-access-kkr8x\") pod \"25ac34fb-287b-4ea4-b349-124a57354453\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.688565 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-combined-ca-bundle\") pod \"25ac34fb-287b-4ea4-b349-124a57354453\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.688661 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-sg-core-conf-yaml\") pod \"25ac34fb-287b-4ea4-b349-124a57354453\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.689018 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25ac34fb-287b-4ea4-b349-124a57354453-run-httpd\") pod \"25ac34fb-287b-4ea4-b349-124a57354453\" (UID: 
\"25ac34fb-287b-4ea4-b349-124a57354453\") " Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.689143 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-config-data\") pod \"25ac34fb-287b-4ea4-b349-124a57354453\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.689272 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-scripts\") pod \"25ac34fb-287b-4ea4-b349-124a57354453\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.689481 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25ac34fb-287b-4ea4-b349-124a57354453-log-httpd\") pod \"25ac34fb-287b-4ea4-b349-124a57354453\" (UID: \"25ac34fb-287b-4ea4-b349-124a57354453\") " Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.690743 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25ac34fb-287b-4ea4-b349-124a57354453-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "25ac34fb-287b-4ea4-b349-124a57354453" (UID: "25ac34fb-287b-4ea4-b349-124a57354453"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.691129 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25ac34fb-287b-4ea4-b349-124a57354453-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "25ac34fb-287b-4ea4-b349-124a57354453" (UID: "25ac34fb-287b-4ea4-b349-124a57354453"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.695476 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.731789 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25ac34fb-287b-4ea4-b349-124a57354453-kube-api-access-kkr8x" (OuterVolumeSpecName: "kube-api-access-kkr8x") pod "25ac34fb-287b-4ea4-b349-124a57354453" (UID: "25ac34fb-287b-4ea4-b349-124a57354453"). InnerVolumeSpecName "kube-api-access-kkr8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.747272 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-scripts" (OuterVolumeSpecName: "scripts") pod "25ac34fb-287b-4ea4-b349-124a57354453" (UID: "25ac34fb-287b-4ea4-b349-124a57354453"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.796184 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25ac34fb-287b-4ea4-b349-124a57354453-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.796208 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.796219 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25ac34fb-287b-4ea4-b349-124a57354453-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.796228 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkr8x\" (UniqueName: \"kubernetes.io/projected/25ac34fb-287b-4ea4-b349-124a57354453-kube-api-access-kkr8x\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.859110 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "25ac34fb-287b-4ea4-b349-124a57354453" (UID: "25ac34fb-287b-4ea4-b349-124a57354453"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.898862 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.915668 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25ac34fb-287b-4ea4-b349-124a57354453","Type":"ContainerDied","Data":"9b5e0d61001119ad8fe8068cad317e7699f8b04e7c6a00e55c83241d6f90265a"} Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.915720 4898 scope.go:117] "RemoveContainer" containerID="66b2a4d4af3a93e795e5f56d227e5414c8e1538bbf1dd1419850642d3f21bf62" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.915958 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.927551 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" event={"ID":"84a913ce-6f2e-4327-89b0-eb40be31c03e","Type":"ContainerStarted","Data":"e9a47768d5332a760761752a343cf5b0496d0addbfc153fedc62814b54cdd698"} Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.927694 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" event={"ID":"84a913ce-6f2e-4327-89b0-eb40be31c03e","Type":"ContainerStarted","Data":"6e213a479ab98c9dd74395981e83aafce69b368b8733a4815c07d410648ff41f"} Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.927802 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.944762 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59dc5cddb6-qswpf" 
event={"ID":"4def8e33-aab9-4c66-8b73-c866ac6c5047","Type":"ContainerStarted","Data":"4d47a2ecefda2198c17a4c10b42f72ca37c81d0e136444649e79b5514837f21b"} Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.954698 4898 scope.go:117] "RemoveContainer" containerID="23e2f642ee8186e197f5e9d46b2e12731ff4e5497740cf42cbc2824699a0608e" Dec 11 13:27:48 crc kubenswrapper[4898]: E1211 13:27:48.954991 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-58bd476547-b52w6_openstack(2caa9e06-4684-4011-8b55-e4436bf09d20)\"" pod="openstack/heat-api-58bd476547-b52w6" podUID="2caa9e06-4684-4011-8b55-e4436bf09d20" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.955009 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5a98434b-9a01-4f9f-a0c0-5c52ab613405","Type":"ContainerStarted","Data":"60f1c94c3b97c5f1e80ee0ed3851453b90356f587bf7ac6c251113943956b265"} Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.955477 4898 scope.go:117] "RemoveContainer" containerID="189e28771eb11157a5b2e0bf0100157d2e510f5ac65180cf81847641cd19ccf7" Dec 11 13:27:48 crc kubenswrapper[4898]: E1211 13:27:48.955995 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-7dc95c5f9b-hp64x_openstack(0720809d-a8eb-4edd-a201-67305a32bc97)\"" pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" podUID="0720809d-a8eb-4edd-a201-67305a32bc97" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.956448 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.956782 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.973255 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" podStartSLOduration=2.973238658 podStartE2EDuration="2.973238658s" podCreationTimestamp="2025-12-11 13:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:48.945111587 +0000 UTC m=+1426.517438024" watchObservedRunningTime="2025-12-11 13:27:48.973238658 +0000 UTC m=+1426.545565095" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.991006 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.990984668 podStartE2EDuration="4.990984668s" podCreationTimestamp="2025-12-11 13:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:48.976435244 +0000 UTC m=+1426.548761681" watchObservedRunningTime="2025-12-11 13:27:48.990984668 +0000 UTC m=+1426.563311105" Dec 11 13:27:48 crc kubenswrapper[4898]: I1211 13:27:48.998822 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25ac34fb-287b-4ea4-b349-124a57354453" (UID: "25ac34fb-287b-4ea4-b349-124a57354453"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.002069 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.027148 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-config-data" (OuterVolumeSpecName: "config-data") pod "25ac34fb-287b-4ea4-b349-124a57354453" (UID: "25ac34fb-287b-4ea4-b349-124a57354453"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.104626 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25ac34fb-287b-4ea4-b349-124a57354453-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.114120 4898 scope.go:117] "RemoveContainer" containerID="374a78de8739eef4fa406721b9dca1ec6b141ead76b48ecae8837b884187cc46" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.158045 4898 scope.go:117] "RemoveContainer" containerID="28921edda652003320dab5988a7617717d4874a26d9de9ca3ec96c8cb7818cb9" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.189918 4898 scope.go:117] "RemoveContainer" containerID="6f15206f068cd564bdcfebb3a774f04b52319bb6114731e417b7e5baec2291c0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.291603 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.317527 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.368133 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 
13:27:49 crc kubenswrapper[4898]: E1211 13:27:49.368559 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ac34fb-287b-4ea4-b349-124a57354453" containerName="proxy-httpd" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.368571 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ac34fb-287b-4ea4-b349-124a57354453" containerName="proxy-httpd" Dec 11 13:27:49 crc kubenswrapper[4898]: E1211 13:27:49.368594 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ac34fb-287b-4ea4-b349-124a57354453" containerName="sg-core" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.368601 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ac34fb-287b-4ea4-b349-124a57354453" containerName="sg-core" Dec 11 13:27:49 crc kubenswrapper[4898]: E1211 13:27:49.368626 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ac34fb-287b-4ea4-b349-124a57354453" containerName="ceilometer-central-agent" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.368632 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ac34fb-287b-4ea4-b349-124a57354453" containerName="ceilometer-central-agent" Dec 11 13:27:49 crc kubenswrapper[4898]: E1211 13:27:49.368660 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ac34fb-287b-4ea4-b349-124a57354453" containerName="ceilometer-notification-agent" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.368665 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ac34fb-287b-4ea4-b349-124a57354453" containerName="ceilometer-notification-agent" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.368862 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ac34fb-287b-4ea4-b349-124a57354453" containerName="sg-core" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.368886 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ac34fb-287b-4ea4-b349-124a57354453" 
containerName="ceilometer-notification-agent" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.368897 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ac34fb-287b-4ea4-b349-124a57354453" containerName="ceilometer-central-agent" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.368911 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ac34fb-287b-4ea4-b349-124a57354453" containerName="proxy-httpd" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.370918 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.376961 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.377209 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.393196 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.513898 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.513976 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cd68b61-5d09-46d1-908e-6c4e72120b56-run-httpd\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.514038 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.514071 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-scripts\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.514200 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cd68b61-5d09-46d1-908e-6c4e72120b56-log-httpd\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.514251 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf5xb\" (UniqueName: \"kubernetes.io/projected/6cd68b61-5d09-46d1-908e-6c4e72120b56-kube-api-access-mf5xb\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.514272 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-config-data\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.616311 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.616381 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cd68b61-5d09-46d1-908e-6c4e72120b56-run-httpd\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.616467 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.616514 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-scripts\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.616639 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cd68b61-5d09-46d1-908e-6c4e72120b56-log-httpd\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.616684 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf5xb\" (UniqueName: \"kubernetes.io/projected/6cd68b61-5d09-46d1-908e-6c4e72120b56-kube-api-access-mf5xb\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.616709 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-config-data\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.617003 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cd68b61-5d09-46d1-908e-6c4e72120b56-run-httpd\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.617074 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cd68b61-5d09-46d1-908e-6c4e72120b56-log-httpd\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.639447 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.642578 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.644855 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-config-data\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.647435 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-scripts\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.650307 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf5xb\" (UniqueName: \"kubernetes.io/projected/6cd68b61-5d09-46d1-908e-6c4e72120b56-kube-api-access-mf5xb\") pod \"ceilometer-0\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.723338 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.980094 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59dc5cddb6-qswpf" event={"ID":"4def8e33-aab9-4c66-8b73-c866ac6c5047","Type":"ContainerStarted","Data":"6020ea8fd58098a1bd89bcd4ea035d07467ae603bbc9618ec25600a4f889f68a"} Dec 11 13:27:49 crc kubenswrapper[4898]: I1211 13:27:49.980288 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:50 crc kubenswrapper[4898]: I1211 13:27:50.011851 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-59dc5cddb6-qswpf" podStartSLOduration=4.011773373 podStartE2EDuration="4.011773373s" podCreationTimestamp="2025-12-11 13:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:27:49.996685344 +0000 UTC m=+1427.569011781" watchObservedRunningTime="2025-12-11 13:27:50.011773373 +0000 UTC m=+1427.584099810" Dec 11 13:27:50 crc kubenswrapper[4898]: I1211 13:27:50.792355 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="25ac34fb-287b-4ea4-b349-124a57354453" path="/var/lib/kubelet/pods/25ac34fb-287b-4ea4-b349-124a57354453/volumes" Dec 11 13:27:50 crc kubenswrapper[4898]: I1211 13:27:50.819435 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:51 crc kubenswrapper[4898]: I1211 13:27:51.005130 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 13:27:51 crc kubenswrapper[4898]: I1211 13:27:51.005162 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 13:27:51 crc kubenswrapper[4898]: I1211 13:27:51.492541 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" podUID="0ee1613f-2749-4158-b826-8096758ca04f" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.214:8000/healthcheck\": read tcp 10.217.0.2:54082->10.217.0.214:8000: read: connection reset by peer" Dec 11 13:27:51 crc kubenswrapper[4898]: I1211 13:27:51.493338 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" podUID="0ee1613f-2749-4158-b826-8096758ca04f" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.214:8000/healthcheck\": dial tcp 10.217.0.214:8000: connect: connection refused" Dec 11 13:27:51 crc kubenswrapper[4898]: I1211 13:27:51.773778 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:27:52 crc kubenswrapper[4898]: I1211 13:27:52.018397 4898 generic.go:334] "Generic (PLEG): container finished" podID="0ee1613f-2749-4158-b826-8096758ca04f" containerID="3e4d687a0921db9498e4173ffb0d78fb822d8dfbfe1a84dcabbeaaaaa578664a" exitCode=0 Dec 11 13:27:52 crc kubenswrapper[4898]: I1211 13:27:52.018443 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" 
event={"ID":"0ee1613f-2749-4158-b826-8096758ca04f","Type":"ContainerDied","Data":"3e4d687a0921db9498e4173ffb0d78fb822d8dfbfe1a84dcabbeaaaaa578664a"} Dec 11 13:27:53 crc kubenswrapper[4898]: I1211 13:27:53.441929 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-6ff897fbd7-rddq7" podUID="c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.215:8004/healthcheck\": read tcp 10.217.0.2:59050->10.217.0.215:8004: read: connection reset by peer" Dec 11 13:27:54 crc kubenswrapper[4898]: I1211 13:27:54.053805 4898 generic.go:334] "Generic (PLEG): container finished" podID="c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee" containerID="75320a8b39bb7fa3543db9e0288c17f254ec4ffaa105274e968f0d723be9cc21" exitCode=0 Dec 11 13:27:54 crc kubenswrapper[4898]: I1211 13:27:54.053877 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6ff897fbd7-rddq7" event={"ID":"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee","Type":"ContainerDied","Data":"75320a8b39bb7fa3543db9e0288c17f254ec4ffaa105274e968f0d723be9cc21"} Dec 11 13:27:54 crc kubenswrapper[4898]: I1211 13:27:54.122109 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 11 13:27:54 crc kubenswrapper[4898]: I1211 13:27:54.122233 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 13:27:54 crc kubenswrapper[4898]: I1211 13:27:54.126522 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 11 13:27:54 crc kubenswrapper[4898]: I1211 13:27:54.361799 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:27:54 crc kubenswrapper[4898]: I1211 13:27:54.456221 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-6ff897fbd7-rddq7" podUID="c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee" 
containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.215:8004/healthcheck\": dial tcp 10.217.0.215:8004: connect: connection refused" Dec 11 13:27:54 crc kubenswrapper[4898]: I1211 13:27:54.711271 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" podUID="0ee1613f-2749-4158-b826-8096758ca04f" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.214:8000/healthcheck\": dial tcp 10.217.0.214:8000: connect: connection refused" Dec 11 13:27:55 crc kubenswrapper[4898]: I1211 13:27:55.244127 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 11 13:27:55 crc kubenswrapper[4898]: I1211 13:27:55.244411 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 11 13:27:55 crc kubenswrapper[4898]: I1211 13:27:55.297576 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 11 13:27:55 crc kubenswrapper[4898]: I1211 13:27:55.304687 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 11 13:27:56 crc kubenswrapper[4898]: I1211 13:27:56.081434 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 11 13:27:56 crc kubenswrapper[4898]: I1211 13:27:56.081736 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 11 13:27:57 crc kubenswrapper[4898]: W1211 13:27:57.705081 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cd68b61_5d09_46d1_908e_6c4e72120b56.slice/crio-5c07f9cff2353bcf5cf4b2f004a7c7e797e6adf5da75b6d3981fe6053f04f96f WatchSource:0}: Error finding container 
5c07f9cff2353bcf5cf4b2f004a7c7e797e6adf5da75b6d3981fe6053f04f96f: Status 404 returned error can't find the container with id 5c07f9cff2353bcf5cf4b2f004a7c7e797e6adf5da75b6d3981fe6053f04f96f Dec 11 13:27:58 crc kubenswrapper[4898]: I1211 13:27:58.147643 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cd68b61-5d09-46d1-908e-6c4e72120b56","Type":"ContainerStarted","Data":"5c07f9cff2353bcf5cf4b2f004a7c7e797e6adf5da75b6d3981fe6053f04f96f"} Dec 11 13:27:58 crc kubenswrapper[4898]: I1211 13:27:58.900744 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 11 13:27:58 crc kubenswrapper[4898]: I1211 13:27:58.901141 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 13:27:58 crc kubenswrapper[4898]: I1211 13:27:58.921496 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.154283 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.163531 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" event={"ID":"0ee1613f-2749-4158-b826-8096758ca04f","Type":"ContainerDied","Data":"259bffd5dee25aedcda33245332fe75a51051ac1bdb36b148371210739ad4e1e"} Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.163579 4898 scope.go:117] "RemoveContainer" containerID="3e4d687a0921db9498e4173ffb0d78fb822d8dfbfe1a84dcabbeaaaaa578664a" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.163581 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6cfcb9bcc9-8j4sz" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.303071 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-combined-ca-bundle\") pod \"0ee1613f-2749-4158-b826-8096758ca04f\" (UID: \"0ee1613f-2749-4158-b826-8096758ca04f\") " Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.303650 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-config-data-custom\") pod \"0ee1613f-2749-4158-b826-8096758ca04f\" (UID: \"0ee1613f-2749-4158-b826-8096758ca04f\") " Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.303705 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-config-data\") pod \"0ee1613f-2749-4158-b826-8096758ca04f\" (UID: \"0ee1613f-2749-4158-b826-8096758ca04f\") " Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.303806 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg7zt\" (UniqueName: \"kubernetes.io/projected/0ee1613f-2749-4158-b826-8096758ca04f-kube-api-access-bg7zt\") pod \"0ee1613f-2749-4158-b826-8096758ca04f\" (UID: \"0ee1613f-2749-4158-b826-8096758ca04f\") " Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.316598 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0ee1613f-2749-4158-b826-8096758ca04f" (UID: "0ee1613f-2749-4158-b826-8096758ca04f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.323755 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.327702 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee1613f-2749-4158-b826-8096758ca04f-kube-api-access-bg7zt" (OuterVolumeSpecName: "kube-api-access-bg7zt") pod "0ee1613f-2749-4158-b826-8096758ca04f" (UID: "0ee1613f-2749-4158-b826-8096758ca04f"). InnerVolumeSpecName "kube-api-access-bg7zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.367625 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ee1613f-2749-4158-b826-8096758ca04f" (UID: "0ee1613f-2749-4158-b826-8096758ca04f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.427447 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg7zt\" (UniqueName: \"kubernetes.io/projected/0ee1613f-2749-4158-b826-8096758ca04f-kube-api-access-bg7zt\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.427501 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.449378 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-config-data" (OuterVolumeSpecName: "config-data") pod "0ee1613f-2749-4158-b826-8096758ca04f" (UID: "0ee1613f-2749-4158-b826-8096758ca04f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.459295 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-6ff897fbd7-rddq7" podUID="c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.215:8004/healthcheck\": dial tcp 10.217.0.215:8004: connect: connection refused" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.529577 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ee1613f-2749-4158-b826-8096758ca04f-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.595505 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6cfcb9bcc9-8j4sz"] Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.615407 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6cfcb9bcc9-8j4sz"] Dec 11 
13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.830856 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.901968 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.946203 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bjms\" (UniqueName: \"kubernetes.io/projected/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-kube-api-access-5bjms\") pod \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.946351 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-config-data\") pod \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.946546 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-config-data-custom\") pod \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.946621 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-combined-ca-bundle\") pod \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\" (UID: \"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee\") " Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.956128 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-58bd476547-b52w6"] Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 
13:27:59.971123 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-kube-api-access-5bjms" (OuterVolumeSpecName: "kube-api-access-5bjms") pod "c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee" (UID: "c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee"). InnerVolumeSpecName "kube-api-access-5bjms". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:27:59 crc kubenswrapper[4898]: I1211 13:27:59.975153 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee" (UID: "c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.046163 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee" (UID: "c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.048994 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bjms\" (UniqueName: \"kubernetes.io/projected/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-kube-api-access-5bjms\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.049125 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.049282 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.061570 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-config-data" (OuterVolumeSpecName: "config-data") pod "c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee" (UID: "c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.150909 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.219550 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cd68b61-5d09-46d1-908e-6c4e72120b56","Type":"ContainerStarted","Data":"274b0abc1fdfc3ce5a15a25adbb0b48e53b5a3fcf39978399042c805a99784f1"} Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.224051 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6ff897fbd7-rddq7" event={"ID":"c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee","Type":"ContainerDied","Data":"cebf79a6c94cfbbe95f3601b078191aea3ed4c0f4aa59ba5c70f314c998dfed2"} Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.224104 4898 scope.go:117] "RemoveContainer" containerID="75320a8b39bb7fa3543db9e0288c17f254ec4ffaa105274e968f0d723be9cc21" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.224208 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6ff897fbd7-rddq7" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.255217 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rpgj6" event={"ID":"421ec273-9526-4100-9a5a-63e0512beee3","Type":"ContainerStarted","Data":"16872782332a1dbb940497943f7cacb007dc4833d1469644a0b34aa6937ab657"} Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.314137 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.342644 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6ff897fbd7-rddq7"] Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.354535 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-6ff897fbd7-rddq7"] Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.363374 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-rpgj6" podStartSLOduration=3.1580526190000002 podStartE2EDuration="21.363352625s" podCreationTimestamp="2025-12-11 13:27:39 +0000 UTC" firstStartedPulling="2025-12-11 13:27:41.242705418 +0000 UTC m=+1418.815031855" lastFinishedPulling="2025-12-11 13:27:59.448005424 +0000 UTC m=+1437.020331861" observedRunningTime="2025-12-11 13:28:00.293718581 +0000 UTC m=+1437.866045018" watchObservedRunningTime="2025-12-11 13:28:00.363352625 +0000 UTC m=+1437.935679052" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.399820 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7dc95c5f9b-hp64x"] Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.524105 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.567950 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-combined-ca-bundle\") pod \"2caa9e06-4684-4011-8b55-e4436bf09d20\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.568088 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp44m\" (UniqueName: \"kubernetes.io/projected/2caa9e06-4684-4011-8b55-e4436bf09d20-kube-api-access-kp44m\") pod \"2caa9e06-4684-4011-8b55-e4436bf09d20\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.568125 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-config-data\") pod \"2caa9e06-4684-4011-8b55-e4436bf09d20\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.568168 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-config-data-custom\") pod \"2caa9e06-4684-4011-8b55-e4436bf09d20\" (UID: \"2caa9e06-4684-4011-8b55-e4436bf09d20\") " Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.578667 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2caa9e06-4684-4011-8b55-e4436bf09d20" (UID: "2caa9e06-4684-4011-8b55-e4436bf09d20"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.578709 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2caa9e06-4684-4011-8b55-e4436bf09d20-kube-api-access-kp44m" (OuterVolumeSpecName: "kube-api-access-kp44m") pod "2caa9e06-4684-4011-8b55-e4436bf09d20" (UID: "2caa9e06-4684-4011-8b55-e4436bf09d20"). InnerVolumeSpecName "kube-api-access-kp44m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.611789 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2caa9e06-4684-4011-8b55-e4436bf09d20" (UID: "2caa9e06-4684-4011-8b55-e4436bf09d20"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.645705 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-config-data" (OuterVolumeSpecName: "config-data") pod "2caa9e06-4684-4011-8b55-e4436bf09d20" (UID: "2caa9e06-4684-4011-8b55-e4436bf09d20"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.673032 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.673066 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp44m\" (UniqueName: \"kubernetes.io/projected/2caa9e06-4684-4011-8b55-e4436bf09d20-kube-api-access-kp44m\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.673077 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.673087 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caa9e06-4684-4011-8b55-e4436bf09d20-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.781742 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.793747 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ee1613f-2749-4158-b826-8096758ca04f" path="/var/lib/kubelet/pods/0ee1613f-2749-4158-b826-8096758ca04f/volumes" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.794514 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee" path="/var/lib/kubelet/pods/c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee/volumes" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.878086 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49vrz\" (UniqueName: \"kubernetes.io/projected/0720809d-a8eb-4edd-a201-67305a32bc97-kube-api-access-49vrz\") pod \"0720809d-a8eb-4edd-a201-67305a32bc97\" (UID: \"0720809d-a8eb-4edd-a201-67305a32bc97\") " Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.878673 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-combined-ca-bundle\") pod \"0720809d-a8eb-4edd-a201-67305a32bc97\" (UID: \"0720809d-a8eb-4edd-a201-67305a32bc97\") " Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.879151 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-config-data\") pod \"0720809d-a8eb-4edd-a201-67305a32bc97\" (UID: \"0720809d-a8eb-4edd-a201-67305a32bc97\") " Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.880419 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-config-data-custom\") pod \"0720809d-a8eb-4edd-a201-67305a32bc97\" (UID: \"0720809d-a8eb-4edd-a201-67305a32bc97\") 
" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.890157 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0720809d-a8eb-4edd-a201-67305a32bc97" (UID: "0720809d-a8eb-4edd-a201-67305a32bc97"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.890580 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.899350 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0720809d-a8eb-4edd-a201-67305a32bc97-kube-api-access-49vrz" (OuterVolumeSpecName: "kube-api-access-49vrz") pod "0720809d-a8eb-4edd-a201-67305a32bc97" (UID: "0720809d-a8eb-4edd-a201-67305a32bc97"). InnerVolumeSpecName "kube-api-access-49vrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.945374 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0720809d-a8eb-4edd-a201-67305a32bc97" (UID: "0720809d-a8eb-4edd-a201-67305a32bc97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.987035 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-config-data" (OuterVolumeSpecName: "config-data") pod "0720809d-a8eb-4edd-a201-67305a32bc97" (UID: "0720809d-a8eb-4edd-a201-67305a32bc97"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.997986 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49vrz\" (UniqueName: \"kubernetes.io/projected/0720809d-a8eb-4edd-a201-67305a32bc97-kube-api-access-49vrz\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.998013 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:00 crc kubenswrapper[4898]: I1211 13:28:00.998023 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0720809d-a8eb-4edd-a201-67305a32bc97-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:01 crc kubenswrapper[4898]: I1211 13:28:01.266033 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cd68b61-5d09-46d1-908e-6c4e72120b56","Type":"ContainerStarted","Data":"2e8bac42a8b3e790b20f0794f98cc0756711880a237f2fd8e9a96afe9cdb3388"} Dec 11 13:28:01 crc kubenswrapper[4898]: I1211 13:28:01.268556 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58bd476547-b52w6" event={"ID":"2caa9e06-4684-4011-8b55-e4436bf09d20","Type":"ContainerDied","Data":"4657e0fe7edfb5a6943ba3fcdebea8595b61d5eb4bfe3084b7a7c0507678e1e0"} Dec 11 13:28:01 crc kubenswrapper[4898]: I1211 13:28:01.268739 4898 scope.go:117] "RemoveContainer" containerID="23e2f642ee8186e197f5e9d46b2e12731ff4e5497740cf42cbc2824699a0608e" Dec 11 13:28:01 crc kubenswrapper[4898]: I1211 13:28:01.268590 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-58bd476547-b52w6" Dec 11 13:28:01 crc kubenswrapper[4898]: I1211 13:28:01.271035 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" Dec 11 13:28:01 crc kubenswrapper[4898]: I1211 13:28:01.271008 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7dc95c5f9b-hp64x" event={"ID":"0720809d-a8eb-4edd-a201-67305a32bc97","Type":"ContainerDied","Data":"c7744e9ebb929645ee9eb9d00537c73ba5dde4990782b9793eb6e6304c5f9a56"} Dec 11 13:28:01 crc kubenswrapper[4898]: I1211 13:28:01.295762 4898 scope.go:117] "RemoveContainer" containerID="189e28771eb11157a5b2e0bf0100157d2e510f5ac65180cf81847641cd19ccf7" Dec 11 13:28:01 crc kubenswrapper[4898]: I1211 13:28:01.300535 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-58bd476547-b52w6"] Dec 11 13:28:01 crc kubenswrapper[4898]: I1211 13:28:01.326352 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-58bd476547-b52w6"] Dec 11 13:28:01 crc kubenswrapper[4898]: I1211 13:28:01.384817 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7dc95c5f9b-hp64x"] Dec 11 13:28:01 crc kubenswrapper[4898]: I1211 13:28:01.397429 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-7dc95c5f9b-hp64x"] Dec 11 13:28:02 crc kubenswrapper[4898]: I1211 13:28:02.285706 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cd68b61-5d09-46d1-908e-6c4e72120b56","Type":"ContainerStarted","Data":"1ea49050370723bf68b85266e2fa38d433ebc60e30a962085c0c9e4d75ad8d18"} Dec 11 13:28:02 crc kubenswrapper[4898]: I1211 13:28:02.740673 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:28:02 crc kubenswrapper[4898]: I1211 13:28:02.816831 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0720809d-a8eb-4edd-a201-67305a32bc97" path="/var/lib/kubelet/pods/0720809d-a8eb-4edd-a201-67305a32bc97/volumes" Dec 11 13:28:02 crc kubenswrapper[4898]: I1211 
13:28:02.834745 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2caa9e06-4684-4011-8b55-e4436bf09d20" path="/var/lib/kubelet/pods/2caa9e06-4684-4011-8b55-e4436bf09d20/volumes" Dec 11 13:28:02 crc kubenswrapper[4898]: I1211 13:28:02.877929 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-596c56fc48-gpcvd"] Dec 11 13:28:02 crc kubenswrapper[4898]: I1211 13:28:02.878557 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-596c56fc48-gpcvd" podUID="eff78caa-c237-4040-b508-7cd9b8b5413f" containerName="heat-engine" containerID="cri-o://fc709e30f1c7b9a4dfc26b17afc9659745d99b9f0f5fc75d869e217db7fd7218" gracePeriod=60 Dec 11 13:28:04 crc kubenswrapper[4898]: I1211 13:28:04.312082 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cd68b61-5d09-46d1-908e-6c4e72120b56","Type":"ContainerStarted","Data":"1bbaa7c3bb200ccc1c640e6114f7e3a0303fdd1b94ede257fd32fb93797d2494"} Dec 11 13:28:04 crc kubenswrapper[4898]: I1211 13:28:04.312745 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="ceilometer-central-agent" containerID="cri-o://274b0abc1fdfc3ce5a15a25adbb0b48e53b5a3fcf39978399042c805a99784f1" gracePeriod=30 Dec 11 13:28:04 crc kubenswrapper[4898]: I1211 13:28:04.312976 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 11 13:28:04 crc kubenswrapper[4898]: I1211 13:28:04.315030 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="proxy-httpd" containerID="cri-o://1bbaa7c3bb200ccc1c640e6114f7e3a0303fdd1b94ede257fd32fb93797d2494" gracePeriod=30 Dec 11 13:28:04 crc kubenswrapper[4898]: I1211 13:28:04.315192 4898 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="sg-core" containerID="cri-o://1ea49050370723bf68b85266e2fa38d433ebc60e30a962085c0c9e4d75ad8d18" gracePeriod=30 Dec 11 13:28:04 crc kubenswrapper[4898]: I1211 13:28:04.315287 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="ceilometer-notification-agent" containerID="cri-o://2e8bac42a8b3e790b20f0794f98cc0756711880a237f2fd8e9a96afe9cdb3388" gracePeriod=30 Dec 11 13:28:04 crc kubenswrapper[4898]: E1211 13:28:04.330282 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc709e30f1c7b9a4dfc26b17afc9659745d99b9f0f5fc75d869e217db7fd7218" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 11 13:28:04 crc kubenswrapper[4898]: E1211 13:28:04.333914 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc709e30f1c7b9a4dfc26b17afc9659745d99b9f0f5fc75d869e217db7fd7218" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 11 13:28:04 crc kubenswrapper[4898]: E1211 13:28:04.341580 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc709e30f1c7b9a4dfc26b17afc9659745d99b9f0f5fc75d869e217db7fd7218" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 11 13:28:04 crc kubenswrapper[4898]: E1211 13:28:04.341654 4898 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openstack/heat-engine-596c56fc48-gpcvd" podUID="eff78caa-c237-4040-b508-7cd9b8b5413f" containerName="heat-engine" Dec 11 13:28:04 crc kubenswrapper[4898]: I1211 13:28:04.342605 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=11.015914573 podStartE2EDuration="15.342591755s" podCreationTimestamp="2025-12-11 13:27:49 +0000 UTC" firstStartedPulling="2025-12-11 13:27:58.930576381 +0000 UTC m=+1436.502902818" lastFinishedPulling="2025-12-11 13:28:03.257253563 +0000 UTC m=+1440.829580000" observedRunningTime="2025-12-11 13:28:04.338643458 +0000 UTC m=+1441.910969895" watchObservedRunningTime="2025-12-11 13:28:04.342591755 +0000 UTC m=+1441.914918192" Dec 11 13:28:04 crc kubenswrapper[4898]: I1211 13:28:04.995516 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:28:04 crc kubenswrapper[4898]: I1211 13:28:04.995819 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:28:05 crc kubenswrapper[4898]: I1211 13:28:05.329079 4898 generic.go:334] "Generic (PLEG): container finished" podID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerID="1bbaa7c3bb200ccc1c640e6114f7e3a0303fdd1b94ede257fd32fb93797d2494" exitCode=0 Dec 11 13:28:05 crc kubenswrapper[4898]: I1211 13:28:05.329131 4898 generic.go:334] "Generic (PLEG): container finished" podID="6cd68b61-5d09-46d1-908e-6c4e72120b56" 
containerID="1ea49050370723bf68b85266e2fa38d433ebc60e30a962085c0c9e4d75ad8d18" exitCode=2 Dec 11 13:28:05 crc kubenswrapper[4898]: I1211 13:28:05.329143 4898 generic.go:334] "Generic (PLEG): container finished" podID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerID="2e8bac42a8b3e790b20f0794f98cc0756711880a237f2fd8e9a96afe9cdb3388" exitCode=0 Dec 11 13:28:05 crc kubenswrapper[4898]: I1211 13:28:05.329161 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cd68b61-5d09-46d1-908e-6c4e72120b56","Type":"ContainerDied","Data":"1bbaa7c3bb200ccc1c640e6114f7e3a0303fdd1b94ede257fd32fb93797d2494"} Dec 11 13:28:05 crc kubenswrapper[4898]: I1211 13:28:05.329214 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cd68b61-5d09-46d1-908e-6c4e72120b56","Type":"ContainerDied","Data":"1ea49050370723bf68b85266e2fa38d433ebc60e30a962085c0c9e4d75ad8d18"} Dec 11 13:28:05 crc kubenswrapper[4898]: I1211 13:28:05.329233 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cd68b61-5d09-46d1-908e-6c4e72120b56","Type":"ContainerDied","Data":"2e8bac42a8b3e790b20f0794f98cc0756711880a237f2fd8e9a96afe9cdb3388"} Dec 11 13:28:10 crc kubenswrapper[4898]: I1211 13:28:10.387406 4898 generic.go:334] "Generic (PLEG): container finished" podID="eff78caa-c237-4040-b508-7cd9b8b5413f" containerID="fc709e30f1c7b9a4dfc26b17afc9659745d99b9f0f5fc75d869e217db7fd7218" exitCode=0 Dec 11 13:28:10 crc kubenswrapper[4898]: I1211 13:28:10.387495 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-596c56fc48-gpcvd" event={"ID":"eff78caa-c237-4040-b508-7cd9b8b5413f","Type":"ContainerDied","Data":"fc709e30f1c7b9a4dfc26b17afc9659745d99b9f0f5fc75d869e217db7fd7218"} Dec 11 13:28:10 crc kubenswrapper[4898]: I1211 13:28:10.685071 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:28:10 crc kubenswrapper[4898]: I1211 13:28:10.828765 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-config-data-custom\") pod \"eff78caa-c237-4040-b508-7cd9b8b5413f\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " Dec 11 13:28:10 crc kubenswrapper[4898]: I1211 13:28:10.828858 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-config-data\") pod \"eff78caa-c237-4040-b508-7cd9b8b5413f\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " Dec 11 13:28:10 crc kubenswrapper[4898]: I1211 13:28:10.828990 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-combined-ca-bundle\") pod \"eff78caa-c237-4040-b508-7cd9b8b5413f\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " Dec 11 13:28:10 crc kubenswrapper[4898]: I1211 13:28:10.829067 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4lb9\" (UniqueName: \"kubernetes.io/projected/eff78caa-c237-4040-b508-7cd9b8b5413f-kube-api-access-w4lb9\") pod \"eff78caa-c237-4040-b508-7cd9b8b5413f\" (UID: \"eff78caa-c237-4040-b508-7cd9b8b5413f\") " Dec 11 13:28:10 crc kubenswrapper[4898]: I1211 13:28:10.835348 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eff78caa-c237-4040-b508-7cd9b8b5413f" (UID: "eff78caa-c237-4040-b508-7cd9b8b5413f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:10 crc kubenswrapper[4898]: I1211 13:28:10.852897 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff78caa-c237-4040-b508-7cd9b8b5413f-kube-api-access-w4lb9" (OuterVolumeSpecName: "kube-api-access-w4lb9") pod "eff78caa-c237-4040-b508-7cd9b8b5413f" (UID: "eff78caa-c237-4040-b508-7cd9b8b5413f"). InnerVolumeSpecName "kube-api-access-w4lb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:28:10 crc kubenswrapper[4898]: I1211 13:28:10.884031 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eff78caa-c237-4040-b508-7cd9b8b5413f" (UID: "eff78caa-c237-4040-b508-7cd9b8b5413f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:10 crc kubenswrapper[4898]: I1211 13:28:10.911981 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-config-data" (OuterVolumeSpecName: "config-data") pod "eff78caa-c237-4040-b508-7cd9b8b5413f" (UID: "eff78caa-c237-4040-b508-7cd9b8b5413f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:10 crc kubenswrapper[4898]: I1211 13:28:10.932045 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:10 crc kubenswrapper[4898]: I1211 13:28:10.932085 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:10 crc kubenswrapper[4898]: I1211 13:28:10.932096 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff78caa-c237-4040-b508-7cd9b8b5413f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:10 crc kubenswrapper[4898]: I1211 13:28:10.932107 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4lb9\" (UniqueName: \"kubernetes.io/projected/eff78caa-c237-4040-b508-7cd9b8b5413f-kube-api-access-w4lb9\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:11 crc kubenswrapper[4898]: I1211 13:28:11.400604 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-596c56fc48-gpcvd" event={"ID":"eff78caa-c237-4040-b508-7cd9b8b5413f","Type":"ContainerDied","Data":"c9837276bd3680657621b6b1a7f9c0e6b336657e3d6e02311e1af86c999327c8"} Dec 11 13:28:11 crc kubenswrapper[4898]: I1211 13:28:11.400665 4898 scope.go:117] "RemoveContainer" containerID="fc709e30f1c7b9a4dfc26b17afc9659745d99b9f0f5fc75d869e217db7fd7218" Dec 11 13:28:11 crc kubenswrapper[4898]: I1211 13:28:11.400673 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-596c56fc48-gpcvd" Dec 11 13:28:11 crc kubenswrapper[4898]: I1211 13:28:11.442677 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-596c56fc48-gpcvd"] Dec 11 13:28:11 crc kubenswrapper[4898]: I1211 13:28:11.455766 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-596c56fc48-gpcvd"] Dec 11 13:28:12 crc kubenswrapper[4898]: I1211 13:28:12.798947 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eff78caa-c237-4040-b508-7cd9b8b5413f" path="/var/lib/kubelet/pods/eff78caa-c237-4040-b508-7cd9b8b5413f/volumes" Dec 11 13:28:13 crc kubenswrapper[4898]: I1211 13:28:13.429140 4898 generic.go:334] "Generic (PLEG): container finished" podID="421ec273-9526-4100-9a5a-63e0512beee3" containerID="16872782332a1dbb940497943f7cacb007dc4833d1469644a0b34aa6937ab657" exitCode=0 Dec 11 13:28:13 crc kubenswrapper[4898]: I1211 13:28:13.429281 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rpgj6" event={"ID":"421ec273-9526-4100-9a5a-63e0512beee3","Type":"ContainerDied","Data":"16872782332a1dbb940497943f7cacb007dc4833d1469644a0b34aa6937ab657"} Dec 11 13:28:14 crc kubenswrapper[4898]: I1211 13:28:14.872688 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:28:14 crc kubenswrapper[4898]: I1211 13:28:14.930878 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-config-data\") pod \"421ec273-9526-4100-9a5a-63e0512beee3\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " Dec 11 13:28:14 crc kubenswrapper[4898]: I1211 13:28:14.931280 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-scripts\") pod \"421ec273-9526-4100-9a5a-63e0512beee3\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " Dec 11 13:28:14 crc kubenswrapper[4898]: I1211 13:28:14.931427 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-combined-ca-bundle\") pod \"421ec273-9526-4100-9a5a-63e0512beee3\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " Dec 11 13:28:14 crc kubenswrapper[4898]: I1211 13:28:14.931818 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-784v4\" (UniqueName: \"kubernetes.io/projected/421ec273-9526-4100-9a5a-63e0512beee3-kube-api-access-784v4\") pod \"421ec273-9526-4100-9a5a-63e0512beee3\" (UID: \"421ec273-9526-4100-9a5a-63e0512beee3\") " Dec 11 13:28:14 crc kubenswrapper[4898]: I1211 13:28:14.983582 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-scripts" (OuterVolumeSpecName: "scripts") pod "421ec273-9526-4100-9a5a-63e0512beee3" (UID: "421ec273-9526-4100-9a5a-63e0512beee3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:14 crc kubenswrapper[4898]: I1211 13:28:14.991663 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421ec273-9526-4100-9a5a-63e0512beee3-kube-api-access-784v4" (OuterVolumeSpecName: "kube-api-access-784v4") pod "421ec273-9526-4100-9a5a-63e0512beee3" (UID: "421ec273-9526-4100-9a5a-63e0512beee3"). InnerVolumeSpecName "kube-api-access-784v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.009251 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-config-data" (OuterVolumeSpecName: "config-data") pod "421ec273-9526-4100-9a5a-63e0512beee3" (UID: "421ec273-9526-4100-9a5a-63e0512beee3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.015548 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "421ec273-9526-4100-9a5a-63e0512beee3" (UID: "421ec273-9526-4100-9a5a-63e0512beee3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.035685 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.035919 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.035986 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-784v4\" (UniqueName: \"kubernetes.io/projected/421ec273-9526-4100-9a5a-63e0512beee3-kube-api-access-784v4\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.036060 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421ec273-9526-4100-9a5a-63e0512beee3-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.400097 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.457028 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rpgj6" event={"ID":"421ec273-9526-4100-9a5a-63e0512beee3","Type":"ContainerDied","Data":"ed982c0a94c488cd357dbbf23c9cb8dc8fca157682e658a06f112e144d99ea85"} Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.457076 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rpgj6" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.457086 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed982c0a94c488cd357dbbf23c9cb8dc8fca157682e658a06f112e144d99ea85" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.459543 4898 generic.go:334] "Generic (PLEG): container finished" podID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerID="274b0abc1fdfc3ce5a15a25adbb0b48e53b5a3fcf39978399042c805a99784f1" exitCode=0 Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.459589 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cd68b61-5d09-46d1-908e-6c4e72120b56","Type":"ContainerDied","Data":"274b0abc1fdfc3ce5a15a25adbb0b48e53b5a3fcf39978399042c805a99784f1"} Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.459617 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cd68b61-5d09-46d1-908e-6c4e72120b56","Type":"ContainerDied","Data":"5c07f9cff2353bcf5cf4b2f004a7c7e797e6adf5da75b6d3981fe6053f04f96f"} Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.459637 4898 scope.go:117] "RemoveContainer" containerID="1bbaa7c3bb200ccc1c640e6114f7e3a0303fdd1b94ede257fd32fb93797d2494" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.459807 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.499968 4898 scope.go:117] "RemoveContainer" containerID="1ea49050370723bf68b85266e2fa38d433ebc60e30a962085c0c9e4d75ad8d18" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.531638 4898 scope.go:117] "RemoveContainer" containerID="2e8bac42a8b3e790b20f0794f98cc0756711880a237f2fd8e9a96afe9cdb3388" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.570504 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cd68b61-5d09-46d1-908e-6c4e72120b56-run-httpd\") pod \"6cd68b61-5d09-46d1-908e-6c4e72120b56\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.570636 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf5xb\" (UniqueName: \"kubernetes.io/projected/6cd68b61-5d09-46d1-908e-6c4e72120b56-kube-api-access-mf5xb\") pod \"6cd68b61-5d09-46d1-908e-6c4e72120b56\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.570725 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-config-data\") pod \"6cd68b61-5d09-46d1-908e-6c4e72120b56\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.570741 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-scripts\") pod \"6cd68b61-5d09-46d1-908e-6c4e72120b56\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.570771 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-sg-core-conf-yaml\") pod \"6cd68b61-5d09-46d1-908e-6c4e72120b56\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.570796 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-combined-ca-bundle\") pod \"6cd68b61-5d09-46d1-908e-6c4e72120b56\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.570909 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cd68b61-5d09-46d1-908e-6c4e72120b56-log-httpd\") pod \"6cd68b61-5d09-46d1-908e-6c4e72120b56\" (UID: \"6cd68b61-5d09-46d1-908e-6c4e72120b56\") " Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.571860 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cd68b61-5d09-46d1-908e-6c4e72120b56-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6cd68b61-5d09-46d1-908e-6c4e72120b56" (UID: "6cd68b61-5d09-46d1-908e-6c4e72120b56"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.572310 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cd68b61-5d09-46d1-908e-6c4e72120b56-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6cd68b61-5d09-46d1-908e-6c4e72120b56" (UID: "6cd68b61-5d09-46d1-908e-6c4e72120b56"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.573037 4898 scope.go:117] "RemoveContainer" containerID="274b0abc1fdfc3ce5a15a25adbb0b48e53b5a3fcf39978399042c805a99784f1" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.577038 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd68b61-5d09-46d1-908e-6c4e72120b56-kube-api-access-mf5xb" (OuterVolumeSpecName: "kube-api-access-mf5xb") pod "6cd68b61-5d09-46d1-908e-6c4e72120b56" (UID: "6cd68b61-5d09-46d1-908e-6c4e72120b56"). InnerVolumeSpecName "kube-api-access-mf5xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.579537 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-scripts" (OuterVolumeSpecName: "scripts") pod "6cd68b61-5d09-46d1-908e-6c4e72120b56" (UID: "6cd68b61-5d09-46d1-908e-6c4e72120b56"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.592867 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 11 13:28:15 crc kubenswrapper[4898]: E1211 13:28:15.593363 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee1613f-2749-4158-b826-8096758ca04f" containerName="heat-cfnapi" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593380 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee1613f-2749-4158-b826-8096758ca04f" containerName="heat-cfnapi" Dec 11 13:28:15 crc kubenswrapper[4898]: E1211 13:28:15.593398 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee" containerName="heat-api" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593405 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee" containerName="heat-api" Dec 11 13:28:15 crc kubenswrapper[4898]: E1211 13:28:15.593412 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0720809d-a8eb-4edd-a201-67305a32bc97" containerName="heat-cfnapi" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593418 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0720809d-a8eb-4edd-a201-67305a32bc97" containerName="heat-cfnapi" Dec 11 13:28:15 crc kubenswrapper[4898]: E1211 13:28:15.593434 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="sg-core" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593440 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="sg-core" Dec 11 13:28:15 crc kubenswrapper[4898]: E1211 13:28:15.593465 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff78caa-c237-4040-b508-7cd9b8b5413f" containerName="heat-engine" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 
13:28:15.593471 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff78caa-c237-4040-b508-7cd9b8b5413f" containerName="heat-engine" Dec 11 13:28:15 crc kubenswrapper[4898]: E1211 13:28:15.593486 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="ceilometer-notification-agent" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593492 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="ceilometer-notification-agent" Dec 11 13:28:15 crc kubenswrapper[4898]: E1211 13:28:15.593504 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="proxy-httpd" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593509 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="proxy-httpd" Dec 11 13:28:15 crc kubenswrapper[4898]: E1211 13:28:15.593521 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2caa9e06-4684-4011-8b55-e4436bf09d20" containerName="heat-api" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593526 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2caa9e06-4684-4011-8b55-e4436bf09d20" containerName="heat-api" Dec 11 13:28:15 crc kubenswrapper[4898]: E1211 13:28:15.593538 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421ec273-9526-4100-9a5a-63e0512beee3" containerName="nova-cell0-conductor-db-sync" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593546 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="421ec273-9526-4100-9a5a-63e0512beee3" containerName="nova-cell0-conductor-db-sync" Dec 11 13:28:15 crc kubenswrapper[4898]: E1211 13:28:15.593561 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2caa9e06-4684-4011-8b55-e4436bf09d20" containerName="heat-api" Dec 11 13:28:15 crc kubenswrapper[4898]: 
I1211 13:28:15.593566 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2caa9e06-4684-4011-8b55-e4436bf09d20" containerName="heat-api" Dec 11 13:28:15 crc kubenswrapper[4898]: E1211 13:28:15.593576 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="ceilometer-central-agent" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593581 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="ceilometer-central-agent" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593782 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="ceilometer-central-agent" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593792 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0720809d-a8eb-4edd-a201-67305a32bc97" containerName="heat-cfnapi" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593802 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6532ba5-2d1a-44f9-bf4a-6fd73f4bb6ee" containerName="heat-api" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593815 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="ceilometer-notification-agent" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593826 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2caa9e06-4684-4011-8b55-e4436bf09d20" containerName="heat-api" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593835 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="421ec273-9526-4100-9a5a-63e0512beee3" containerName="nova-cell0-conductor-db-sync" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593847 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="sg-core" Dec 11 
13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593858 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2caa9e06-4684-4011-8b55-e4436bf09d20" containerName="heat-api" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593866 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" containerName="proxy-httpd" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593876 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff78caa-c237-4040-b508-7cd9b8b5413f" containerName="heat-engine" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.593887 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ee1613f-2749-4158-b826-8096758ca04f" containerName="heat-cfnapi" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.594636 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.599338 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2kdzw" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.602427 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.618887 4898 scope.go:117] "RemoveContainer" containerID="1bbaa7c3bb200ccc1c640e6114f7e3a0303fdd1b94ede257fd32fb93797d2494" Dec 11 13:28:15 crc kubenswrapper[4898]: E1211 13:28:15.621838 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bbaa7c3bb200ccc1c640e6114f7e3a0303fdd1b94ede257fd32fb93797d2494\": container with ID starting with 1bbaa7c3bb200ccc1c640e6114f7e3a0303fdd1b94ede257fd32fb93797d2494 not found: ID does not exist" containerID="1bbaa7c3bb200ccc1c640e6114f7e3a0303fdd1b94ede257fd32fb93797d2494" Dec 11 13:28:15 crc kubenswrapper[4898]: 
I1211 13:28:15.621914 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bbaa7c3bb200ccc1c640e6114f7e3a0303fdd1b94ede257fd32fb93797d2494"} err="failed to get container status \"1bbaa7c3bb200ccc1c640e6114f7e3a0303fdd1b94ede257fd32fb93797d2494\": rpc error: code = NotFound desc = could not find container \"1bbaa7c3bb200ccc1c640e6114f7e3a0303fdd1b94ede257fd32fb93797d2494\": container with ID starting with 1bbaa7c3bb200ccc1c640e6114f7e3a0303fdd1b94ede257fd32fb93797d2494 not found: ID does not exist" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.621967 4898 scope.go:117] "RemoveContainer" containerID="1ea49050370723bf68b85266e2fa38d433ebc60e30a962085c0c9e4d75ad8d18" Dec 11 13:28:15 crc kubenswrapper[4898]: E1211 13:28:15.622776 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ea49050370723bf68b85266e2fa38d433ebc60e30a962085c0c9e4d75ad8d18\": container with ID starting with 1ea49050370723bf68b85266e2fa38d433ebc60e30a962085c0c9e4d75ad8d18 not found: ID does not exist" containerID="1ea49050370723bf68b85266e2fa38d433ebc60e30a962085c0c9e4d75ad8d18" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.622809 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ea49050370723bf68b85266e2fa38d433ebc60e30a962085c0c9e4d75ad8d18"} err="failed to get container status \"1ea49050370723bf68b85266e2fa38d433ebc60e30a962085c0c9e4d75ad8d18\": rpc error: code = NotFound desc = could not find container \"1ea49050370723bf68b85266e2fa38d433ebc60e30a962085c0c9e4d75ad8d18\": container with ID starting with 1ea49050370723bf68b85266e2fa38d433ebc60e30a962085c0c9e4d75ad8d18 not found: ID does not exist" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.622829 4898 scope.go:117] "RemoveContainer" containerID="2e8bac42a8b3e790b20f0794f98cc0756711880a237f2fd8e9a96afe9cdb3388" Dec 11 13:28:15 crc 
kubenswrapper[4898]: E1211 13:28:15.623368 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8bac42a8b3e790b20f0794f98cc0756711880a237f2fd8e9a96afe9cdb3388\": container with ID starting with 2e8bac42a8b3e790b20f0794f98cc0756711880a237f2fd8e9a96afe9cdb3388 not found: ID does not exist" containerID="2e8bac42a8b3e790b20f0794f98cc0756711880a237f2fd8e9a96afe9cdb3388" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.623423 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8bac42a8b3e790b20f0794f98cc0756711880a237f2fd8e9a96afe9cdb3388"} err="failed to get container status \"2e8bac42a8b3e790b20f0794f98cc0756711880a237f2fd8e9a96afe9cdb3388\": rpc error: code = NotFound desc = could not find container \"2e8bac42a8b3e790b20f0794f98cc0756711880a237f2fd8e9a96afe9cdb3388\": container with ID starting with 2e8bac42a8b3e790b20f0794f98cc0756711880a237f2fd8e9a96afe9cdb3388 not found: ID does not exist" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.623560 4898 scope.go:117] "RemoveContainer" containerID="274b0abc1fdfc3ce5a15a25adbb0b48e53b5a3fcf39978399042c805a99784f1" Dec 11 13:28:15 crc kubenswrapper[4898]: E1211 13:28:15.623953 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"274b0abc1fdfc3ce5a15a25adbb0b48e53b5a3fcf39978399042c805a99784f1\": container with ID starting with 274b0abc1fdfc3ce5a15a25adbb0b48e53b5a3fcf39978399042c805a99784f1 not found: ID does not exist" containerID="274b0abc1fdfc3ce5a15a25adbb0b48e53b5a3fcf39978399042c805a99784f1" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.624003 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"274b0abc1fdfc3ce5a15a25adbb0b48e53b5a3fcf39978399042c805a99784f1"} err="failed to get container status 
\"274b0abc1fdfc3ce5a15a25adbb0b48e53b5a3fcf39978399042c805a99784f1\": rpc error: code = NotFound desc = could not find container \"274b0abc1fdfc3ce5a15a25adbb0b48e53b5a3fcf39978399042c805a99784f1\": container with ID starting with 274b0abc1fdfc3ce5a15a25adbb0b48e53b5a3fcf39978399042c805a99784f1 not found: ID does not exist" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.626880 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.648654 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6cd68b61-5d09-46d1-908e-6c4e72120b56" (UID: "6cd68b61-5d09-46d1-908e-6c4e72120b56"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.675377 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbbbea1-e6ed-4ad9-9e31-baf09c312e76-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6dbbbea1-e6ed-4ad9-9e31-baf09c312e76\") " pod="openstack/nova-cell0-conductor-0" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.675519 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbbbea1-e6ed-4ad9-9e31-baf09c312e76-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6dbbbea1-e6ed-4ad9-9e31-baf09c312e76\") " pod="openstack/nova-cell0-conductor-0" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.675564 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdbpd\" (UniqueName: 
\"kubernetes.io/projected/6dbbbea1-e6ed-4ad9-9e31-baf09c312e76-kube-api-access-fdbpd\") pod \"nova-cell0-conductor-0\" (UID: \"6dbbbea1-e6ed-4ad9-9e31-baf09c312e76\") " pod="openstack/nova-cell0-conductor-0" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.675712 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cd68b61-5d09-46d1-908e-6c4e72120b56-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.675733 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf5xb\" (UniqueName: \"kubernetes.io/projected/6cd68b61-5d09-46d1-908e-6c4e72120b56-kube-api-access-mf5xb\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.675742 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.675751 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.675763 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cd68b61-5d09-46d1-908e-6c4e72120b56-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.695568 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cd68b61-5d09-46d1-908e-6c4e72120b56" (UID: "6cd68b61-5d09-46d1-908e-6c4e72120b56"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.725793 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-config-data" (OuterVolumeSpecName: "config-data") pod "6cd68b61-5d09-46d1-908e-6c4e72120b56" (UID: "6cd68b61-5d09-46d1-908e-6c4e72120b56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.776799 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbbbea1-e6ed-4ad9-9e31-baf09c312e76-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6dbbbea1-e6ed-4ad9-9e31-baf09c312e76\") " pod="openstack/nova-cell0-conductor-0" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.777075 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdbpd\" (UniqueName: \"kubernetes.io/projected/6dbbbea1-e6ed-4ad9-9e31-baf09c312e76-kube-api-access-fdbpd\") pod \"nova-cell0-conductor-0\" (UID: \"6dbbbea1-e6ed-4ad9-9e31-baf09c312e76\") " pod="openstack/nova-cell0-conductor-0" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.777320 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbbbea1-e6ed-4ad9-9e31-baf09c312e76-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6dbbbea1-e6ed-4ad9-9e31-baf09c312e76\") " pod="openstack/nova-cell0-conductor-0" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.777544 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.777642 4898 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd68b61-5d09-46d1-908e-6c4e72120b56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.785298 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbbbea1-e6ed-4ad9-9e31-baf09c312e76-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6dbbbea1-e6ed-4ad9-9e31-baf09c312e76\") " pod="openstack/nova-cell0-conductor-0" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.785879 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbbbea1-e6ed-4ad9-9e31-baf09c312e76-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6dbbbea1-e6ed-4ad9-9e31-baf09c312e76\") " pod="openstack/nova-cell0-conductor-0" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.798007 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdbpd\" (UniqueName: \"kubernetes.io/projected/6dbbbea1-e6ed-4ad9-9e31-baf09c312e76-kube-api-access-fdbpd\") pod \"nova-cell0-conductor-0\" (UID: \"6dbbbea1-e6ed-4ad9-9e31-baf09c312e76\") " pod="openstack/nova-cell0-conductor-0" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.903663 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.922872 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.923308 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.935289 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:28:15 crc kubenswrapper[4898]: E1211 13:28:15.935874 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0720809d-a8eb-4edd-a201-67305a32bc97" containerName="heat-cfnapi" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.935893 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0720809d-a8eb-4edd-a201-67305a32bc97" containerName="heat-cfnapi" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.936128 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0720809d-a8eb-4edd-a201-67305a32bc97" containerName="heat-cfnapi" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.938128 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.940097 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.940593 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 13:28:15 crc kubenswrapper[4898]: I1211 13:28:15.949529 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.084432 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.085019 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73500676-55b3-4022-b71b-e9ea005da327-log-httpd\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.085111 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.085348 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54x5g\" (UniqueName: \"kubernetes.io/projected/73500676-55b3-4022-b71b-e9ea005da327-kube-api-access-54x5g\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.085410 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-scripts\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.085637 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-config-data\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.085685 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73500676-55b3-4022-b71b-e9ea005da327-run-httpd\") pod \"ceilometer-0\" (UID: 
\"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.187612 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54x5g\" (UniqueName: \"kubernetes.io/projected/73500676-55b3-4022-b71b-e9ea005da327-kube-api-access-54x5g\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.187679 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-scripts\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.187732 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-config-data\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.187763 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73500676-55b3-4022-b71b-e9ea005da327-run-httpd\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.187835 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.187864 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/73500676-55b3-4022-b71b-e9ea005da327-log-httpd\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.187943 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.191869 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73500676-55b3-4022-b71b-e9ea005da327-log-httpd\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.192285 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73500676-55b3-4022-b71b-e9ea005da327-run-httpd\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.193031 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.194132 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-config-data\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.194580 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.206015 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-scripts\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.213990 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54x5g\" (UniqueName: \"kubernetes.io/projected/73500676-55b3-4022-b71b-e9ea005da327-kube-api-access-54x5g\") pod \"ceilometer-0\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.345258 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.416385 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 11 13:28:16 crc kubenswrapper[4898]: W1211 13:28:16.424405 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dbbbea1_e6ed_4ad9_9e31_baf09c312e76.slice/crio-9e8b2b01818b324ecc16eab7604a5d28a45322704c3f5b4e1143a1fd5cd291ff WatchSource:0}: Error finding container 9e8b2b01818b324ecc16eab7604a5d28a45322704c3f5b4e1143a1fd5cd291ff: Status 404 returned error can't find the container with id 9e8b2b01818b324ecc16eab7604a5d28a45322704c3f5b4e1143a1fd5cd291ff Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.485491 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6dbbbea1-e6ed-4ad9-9e31-baf09c312e76","Type":"ContainerStarted","Data":"9e8b2b01818b324ecc16eab7604a5d28a45322704c3f5b4e1143a1fd5cd291ff"} Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.791626 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd68b61-5d09-46d1-908e-6c4e72120b56" path="/var/lib/kubelet/pods/6cd68b61-5d09-46d1-908e-6c4e72120b56/volumes" Dec 11 13:28:16 crc kubenswrapper[4898]: I1211 13:28:16.854065 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:28:17 crc kubenswrapper[4898]: I1211 13:28:17.505011 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6dbbbea1-e6ed-4ad9-9e31-baf09c312e76","Type":"ContainerStarted","Data":"9c078972b29d5fc6738c0ac9415b13eee1b265891ab0809e60a0b6daf0af086b"} Dec 11 13:28:17 crc kubenswrapper[4898]: I1211 13:28:17.506075 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 11 13:28:17 crc kubenswrapper[4898]: I1211 
13:28:17.508121 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73500676-55b3-4022-b71b-e9ea005da327","Type":"ContainerStarted","Data":"b61db2069e5c7c6e97ddf6b6bab7e5320948ad1d9efbd62a7895ebd1c25250a4"} Dec 11 13:28:17 crc kubenswrapper[4898]: I1211 13:28:17.562695 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.562445011 podStartE2EDuration="2.562445011s" podCreationTimestamp="2025-12-11 13:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:28:17.52694042 +0000 UTC m=+1455.099266857" watchObservedRunningTime="2025-12-11 13:28:17.562445011 +0000 UTC m=+1455.134771448" Dec 11 13:28:18 crc kubenswrapper[4898]: I1211 13:28:18.520946 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73500676-55b3-4022-b71b-e9ea005da327","Type":"ContainerStarted","Data":"dda3ce5a58ed31eafa9421cc97414ff3691240e3840ac40fe6ba7cc2098b9615"} Dec 11 13:28:18 crc kubenswrapper[4898]: I1211 13:28:18.521536 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73500676-55b3-4022-b71b-e9ea005da327","Type":"ContainerStarted","Data":"f7538843310d5e936ec46984e8748bf7460107648430e87a6f8f4ef09f14552a"} Dec 11 13:28:20 crc kubenswrapper[4898]: I1211 13:28:20.544540 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73500676-55b3-4022-b71b-e9ea005da327","Type":"ContainerStarted","Data":"4876e5c8b89988f7bc81d291e1be363f2b4f1e01c8e5cb6cd4a53a2b8f2bf4e8"} Dec 11 13:28:22 crc kubenswrapper[4898]: I1211 13:28:22.584465 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"73500676-55b3-4022-b71b-e9ea005da327","Type":"ContainerStarted","Data":"6b1647c5996bc80836e8b9639147d4427c9df0cbbb330afd8aac53079812cb44"} Dec 11 13:28:22 crc kubenswrapper[4898]: I1211 13:28:22.587203 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 11 13:28:22 crc kubenswrapper[4898]: I1211 13:28:22.608132 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.9507407690000003 podStartE2EDuration="7.60811714s" podCreationTimestamp="2025-12-11 13:28:15 +0000 UTC" firstStartedPulling="2025-12-11 13:28:16.84261427 +0000 UTC m=+1454.414940707" lastFinishedPulling="2025-12-11 13:28:21.499990641 +0000 UTC m=+1459.072317078" observedRunningTime="2025-12-11 13:28:22.606506347 +0000 UTC m=+1460.178832794" watchObservedRunningTime="2025-12-11 13:28:22.60811714 +0000 UTC m=+1460.180443577" Dec 11 13:28:25 crc kubenswrapper[4898]: I1211 13:28:25.955200 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.394219 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.394815 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="ceilometer-central-agent" containerID="cri-o://f7538843310d5e936ec46984e8748bf7460107648430e87a6f8f4ef09f14552a" gracePeriod=30 Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.394951 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="proxy-httpd" containerID="cri-o://6b1647c5996bc80836e8b9639147d4427c9df0cbbb330afd8aac53079812cb44" gracePeriod=30 Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 
13:28:26.395006 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="sg-core" containerID="cri-o://4876e5c8b89988f7bc81d291e1be363f2b4f1e01c8e5cb6cd4a53a2b8f2bf4e8" gracePeriod=30 Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.394979 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="ceilometer-notification-agent" containerID="cri-o://dda3ce5a58ed31eafa9421cc97414ff3691240e3840ac40fe6ba7cc2098b9615" gracePeriod=30 Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.639300 4898 generic.go:334] "Generic (PLEG): container finished" podID="73500676-55b3-4022-b71b-e9ea005da327" containerID="4876e5c8b89988f7bc81d291e1be363f2b4f1e01c8e5cb6cd4a53a2b8f2bf4e8" exitCode=2 Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.639370 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73500676-55b3-4022-b71b-e9ea005da327","Type":"ContainerDied","Data":"4876e5c8b89988f7bc81d291e1be363f2b4f1e01c8e5cb6cd4a53a2b8f2bf4e8"} Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.705067 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-dgfd5"] Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.707056 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.710209 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.710507 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.744521 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dgfd5"] Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.850858 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dgfd5\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.850925 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-config-data\") pod \"nova-cell0-cell-mapping-dgfd5\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.851055 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnvrx\" (UniqueName: \"kubernetes.io/projected/17ff0417-b6d7-42f4-9de4-e2482b659fc2-kube-api-access-pnvrx\") pod \"nova-cell0-cell-mapping-dgfd5\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.851113 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-scripts\") pod \"nova-cell0-cell-mapping-dgfd5\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.890503 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.917307 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.917434 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.930066 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.958944 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnvrx\" (UniqueName: \"kubernetes.io/projected/17ff0417-b6d7-42f4-9de4-e2482b659fc2-kube-api-access-pnvrx\") pod \"nova-cell0-cell-mapping-dgfd5\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.959049 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-scripts\") pod \"nova-cell0-cell-mapping-dgfd5\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.959194 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dgfd5\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " 
pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.959280 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-config-data\") pod \"nova-cell0-cell-mapping-dgfd5\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.985362 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dgfd5\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.985918 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-scripts\") pod \"nova-cell0-cell-mapping-dgfd5\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:26 crc kubenswrapper[4898]: I1211 13:28:26.986530 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-config-data\") pod \"nova-cell0-cell-mapping-dgfd5\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.002311 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnvrx\" (UniqueName: \"kubernetes.io/projected/17ff0417-b6d7-42f4-9de4-e2482b659fc2-kube-api-access-pnvrx\") pod \"nova-cell0-cell-mapping-dgfd5\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 
13:28:27.019221 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.024411 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.033679 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.040203 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.061393 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-hbds7"] Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.062902 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be99ef39-7e89-4189-8f85-91cc80505896-config-data\") pod \"nova-api-0\" (UID: \"be99ef39-7e89-4189-8f85-91cc80505896\") " pod="openstack/nova-api-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.063033 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be99ef39-7e89-4189-8f85-91cc80505896-logs\") pod \"nova-api-0\" (UID: \"be99ef39-7e89-4189-8f85-91cc80505896\") " pod="openstack/nova-api-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.063140 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be99ef39-7e89-4189-8f85-91cc80505896-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"be99ef39-7e89-4189-8f85-91cc80505896\") " pod="openstack/nova-api-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.063210 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4qpx\" (UniqueName: \"kubernetes.io/projected/be99ef39-7e89-4189-8f85-91cc80505896-kube-api-access-s4qpx\") pod \"nova-api-0\" (UID: \"be99ef39-7e89-4189-8f85-91cc80505896\") " pod="openstack/nova-api-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.063840 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-hbds7" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.165051 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.166521 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4qpx\" (UniqueName: \"kubernetes.io/projected/be99ef39-7e89-4189-8f85-91cc80505896-kube-api-access-s4qpx\") pod \"nova-api-0\" (UID: \"be99ef39-7e89-4189-8f85-91cc80505896\") " pod="openstack/nova-api-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.166597 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be99ef39-7e89-4189-8f85-91cc80505896-config-data\") pod \"nova-api-0\" (UID: \"be99ef39-7e89-4189-8f85-91cc80505896\") " pod="openstack/nova-api-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.166632 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4vdw\" (UniqueName: \"kubernetes.io/projected/8a29489d-803b-4a04-a612-485c4a015ccf-kube-api-access-j4vdw\") pod \"nova-scheduler-0\" (UID: \"8a29489d-803b-4a04-a612-485c4a015ccf\") " pod="openstack/nova-scheduler-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.166765 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be99ef39-7e89-4189-8f85-91cc80505896-logs\") pod \"nova-api-0\" (UID: 
\"be99ef39-7e89-4189-8f85-91cc80505896\") " pod="openstack/nova-api-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.166849 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a29489d-803b-4a04-a612-485c4a015ccf-config-data\") pod \"nova-scheduler-0\" (UID: \"8a29489d-803b-4a04-a612-485c4a015ccf\") " pod="openstack/nova-scheduler-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.166887 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/919f5290-479c-4afc-9160-f69d2f2b1e09-operator-scripts\") pod \"aodh-db-create-hbds7\" (UID: \"919f5290-479c-4afc-9160-f69d2f2b1e09\") " pod="openstack/aodh-db-create-hbds7" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.166939 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l92xm\" (UniqueName: \"kubernetes.io/projected/919f5290-479c-4afc-9160-f69d2f2b1e09-kube-api-access-l92xm\") pod \"aodh-db-create-hbds7\" (UID: \"919f5290-479c-4afc-9160-f69d2f2b1e09\") " pod="openstack/aodh-db-create-hbds7" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.166988 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a29489d-803b-4a04-a612-485c4a015ccf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8a29489d-803b-4a04-a612-485c4a015ccf\") " pod="openstack/nova-scheduler-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.167017 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be99ef39-7e89-4189-8f85-91cc80505896-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"be99ef39-7e89-4189-8f85-91cc80505896\") " pod="openstack/nova-api-0" Dec 
11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.168068 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be99ef39-7e89-4189-8f85-91cc80505896-logs\") pod \"nova-api-0\" (UID: \"be99ef39-7e89-4189-8f85-91cc80505896\") " pod="openstack/nova-api-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.184280 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be99ef39-7e89-4189-8f85-91cc80505896-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"be99ef39-7e89-4189-8f85-91cc80505896\") " pod="openstack/nova-api-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.194791 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be99ef39-7e89-4189-8f85-91cc80505896-config-data\") pod \"nova-api-0\" (UID: \"be99ef39-7e89-4189-8f85-91cc80505896\") " pod="openstack/nova-api-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.213096 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-hbds7"] Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.225714 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.227499 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.239780 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.270741 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-3c53-account-create-update-kzh6s"] Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.281860 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a29489d-803b-4a04-a612-485c4a015ccf-config-data\") pod \"nova-scheduler-0\" (UID: \"8a29489d-803b-4a04-a612-485c4a015ccf\") " pod="openstack/nova-scheduler-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.281913 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/919f5290-479c-4afc-9160-f69d2f2b1e09-operator-scripts\") pod \"aodh-db-create-hbds7\" (UID: \"919f5290-479c-4afc-9160-f69d2f2b1e09\") " pod="openstack/aodh-db-create-hbds7" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.281953 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l92xm\" (UniqueName: \"kubernetes.io/projected/919f5290-479c-4afc-9160-f69d2f2b1e09-kube-api-access-l92xm\") pod \"aodh-db-create-hbds7\" (UID: \"919f5290-479c-4afc-9160-f69d2f2b1e09\") " pod="openstack/aodh-db-create-hbds7" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.281980 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a29489d-803b-4a04-a612-485c4a015ccf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8a29489d-803b-4a04-a612-485c4a015ccf\") " pod="openstack/nova-scheduler-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.282079 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j4vdw\" (UniqueName: \"kubernetes.io/projected/8a29489d-803b-4a04-a612-485c4a015ccf-kube-api-access-j4vdw\") pod \"nova-scheduler-0\" (UID: \"8a29489d-803b-4a04-a612-485c4a015ccf\") " pod="openstack/nova-scheduler-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.283695 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/919f5290-479c-4afc-9160-f69d2f2b1e09-operator-scripts\") pod \"aodh-db-create-hbds7\" (UID: \"919f5290-479c-4afc-9160-f69d2f2b1e09\") " pod="openstack/aodh-db-create-hbds7" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.288337 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-3c53-account-create-update-kzh6s" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.290924 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4qpx\" (UniqueName: \"kubernetes.io/projected/be99ef39-7e89-4189-8f85-91cc80505896-kube-api-access-s4qpx\") pod \"nova-api-0\" (UID: \"be99ef39-7e89-4189-8f85-91cc80505896\") " pod="openstack/nova-api-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.297029 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.297255 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a29489d-803b-4a04-a612-485c4a015ccf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8a29489d-803b-4a04-a612-485c4a015ccf\") " pod="openstack/nova-scheduler-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.303007 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.328351 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l92xm\" (UniqueName: \"kubernetes.io/projected/919f5290-479c-4afc-9160-f69d2f2b1e09-kube-api-access-l92xm\") pod \"aodh-db-create-hbds7\" (UID: \"919f5290-479c-4afc-9160-f69d2f2b1e09\") " pod="openstack/aodh-db-create-hbds7" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.333051 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4vdw\" (UniqueName: \"kubernetes.io/projected/8a29489d-803b-4a04-a612-485c4a015ccf-kube-api-access-j4vdw\") pod \"nova-scheduler-0\" (UID: \"8a29489d-803b-4a04-a612-485c4a015ccf\") " pod="openstack/nova-scheduler-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.333129 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-3c53-account-create-update-kzh6s"] Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.335882 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a29489d-803b-4a04-a612-485c4a015ccf-config-data\") pod \"nova-scheduler-0\" (UID: \"8a29489d-803b-4a04-a612-485c4a015ccf\") " pod="openstack/nova-scheduler-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.353680 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.361529 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.370065 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.384268 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd9qp\" (UniqueName: \"kubernetes.io/projected/5ca26903-ffb9-4637-aac0-5284a81cbe85-kube-api-access-cd9qp\") pod \"aodh-3c53-account-create-update-kzh6s\" (UID: \"5ca26903-ffb9-4637-aac0-5284a81cbe85\") " pod="openstack/aodh-3c53-account-create-update-kzh6s" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.384545 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ca26903-ffb9-4637-aac0-5284a81cbe85-operator-scripts\") pod \"aodh-3c53-account-create-update-kzh6s\" (UID: \"5ca26903-ffb9-4637-aac0-5284a81cbe85\") " pod="openstack/aodh-3c53-account-create-update-kzh6s" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.384719 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ba1510-a232-4614-bced-c1afee3bd9b2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"70ba1510-a232-4614-bced-c1afee3bd9b2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.384796 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjjt9\" (UniqueName: \"kubernetes.io/projected/70ba1510-a232-4614-bced-c1afee3bd9b2-kube-api-access-kjjt9\") pod \"nova-cell1-novncproxy-0\" (UID: \"70ba1510-a232-4614-bced-c1afee3bd9b2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.384842 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ba1510-a232-4614-bced-c1afee3bd9b2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"70ba1510-a232-4614-bced-c1afee3bd9b2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.397229 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.463396 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.488797 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ba1510-a232-4614-bced-c1afee3bd9b2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"70ba1510-a232-4614-bced-c1afee3bd9b2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.488868 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjjt9\" (UniqueName: \"kubernetes.io/projected/70ba1510-a232-4614-bced-c1afee3bd9b2-kube-api-access-kjjt9\") pod \"nova-cell1-novncproxy-0\" (UID: \"70ba1510-a232-4614-bced-c1afee3bd9b2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.488902 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ba1510-a232-4614-bced-c1afee3bd9b2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"70ba1510-a232-4614-bced-c1afee3bd9b2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.488983 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b930380f-7595-4ade-99f4-30a280f023ff-logs\") pod \"nova-metadata-0\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " pod="openstack/nova-metadata-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.489022 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd9qp\" (UniqueName: \"kubernetes.io/projected/5ca26903-ffb9-4637-aac0-5284a81cbe85-kube-api-access-cd9qp\") pod \"aodh-3c53-account-create-update-kzh6s\" (UID: \"5ca26903-ffb9-4637-aac0-5284a81cbe85\") " pod="openstack/aodh-3c53-account-create-update-kzh6s" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.489059 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b930380f-7595-4ade-99f4-30a280f023ff-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " pod="openstack/nova-metadata-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.489118 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5xg7\" (UniqueName: \"kubernetes.io/projected/b930380f-7595-4ade-99f4-30a280f023ff-kube-api-access-r5xg7\") pod \"nova-metadata-0\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " pod="openstack/nova-metadata-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.489165 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ca26903-ffb9-4637-aac0-5284a81cbe85-operator-scripts\") pod \"aodh-3c53-account-create-update-kzh6s\" (UID: \"5ca26903-ffb9-4637-aac0-5284a81cbe85\") " pod="openstack/aodh-3c53-account-create-update-kzh6s" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.489184 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b930380f-7595-4ade-99f4-30a280f023ff-config-data\") pod \"nova-metadata-0\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " pod="openstack/nova-metadata-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.491288 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ca26903-ffb9-4637-aac0-5284a81cbe85-operator-scripts\") pod \"aodh-3c53-account-create-update-kzh6s\" (UID: \"5ca26903-ffb9-4637-aac0-5284a81cbe85\") " pod="openstack/aodh-3c53-account-create-update-kzh6s" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.510116 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ba1510-a232-4614-bced-c1afee3bd9b2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"70ba1510-a232-4614-bced-c1afee3bd9b2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.511082 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ba1510-a232-4614-bced-c1afee3bd9b2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"70ba1510-a232-4614-bced-c1afee3bd9b2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.527366 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-g8csh"] Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.529807 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.533034 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-hbds7" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.546241 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.552271 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd9qp\" (UniqueName: \"kubernetes.io/projected/5ca26903-ffb9-4637-aac0-5284a81cbe85-kube-api-access-cd9qp\") pod \"aodh-3c53-account-create-update-kzh6s\" (UID: \"5ca26903-ffb9-4637-aac0-5284a81cbe85\") " pod="openstack/aodh-3c53-account-create-update-kzh6s" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.555896 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjjt9\" (UniqueName: \"kubernetes.io/projected/70ba1510-a232-4614-bced-c1afee3bd9b2-kube-api-access-kjjt9\") pod \"nova-cell1-novncproxy-0\" (UID: \"70ba1510-a232-4614-bced-c1afee3bd9b2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.572231 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.706515 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-3c53-account-create-update-kzh6s" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.745249 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b930380f-7595-4ade-99f4-30a280f023ff-config-data\") pod \"nova-metadata-0\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " pod="openstack/nova-metadata-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.745342 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.745506 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b930380f-7595-4ade-99f4-30a280f023ff-logs\") pod \"nova-metadata-0\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " pod="openstack/nova-metadata-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.745535 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg7vs\" (UniqueName: \"kubernetes.io/projected/15ab34e2-564c-479b-a8af-c428abb92e17-kube-api-access-dg7vs\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.745604 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b930380f-7595-4ade-99f4-30a280f023ff-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " pod="openstack/nova-metadata-0" Dec 11 13:28:27 crc 
kubenswrapper[4898]: I1211 13:28:27.745633 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.745813 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-config\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.745836 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.745891 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.745924 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5xg7\" (UniqueName: \"kubernetes.io/projected/b930380f-7595-4ade-99f4-30a280f023ff-kube-api-access-r5xg7\") pod \"nova-metadata-0\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " pod="openstack/nova-metadata-0" Dec 11 13:28:27 crc 
kubenswrapper[4898]: I1211 13:28:27.746127 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b930380f-7595-4ade-99f4-30a280f023ff-logs\") pod \"nova-metadata-0\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " pod="openstack/nova-metadata-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.746429 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-g8csh"] Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.754445 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b930380f-7595-4ade-99f4-30a280f023ff-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " pod="openstack/nova-metadata-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.783030 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b930380f-7595-4ade-99f4-30a280f023ff-config-data\") pod \"nova-metadata-0\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " pod="openstack/nova-metadata-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.794295 4898 generic.go:334] "Generic (PLEG): container finished" podID="73500676-55b3-4022-b71b-e9ea005da327" containerID="6b1647c5996bc80836e8b9639147d4427c9df0cbbb330afd8aac53079812cb44" exitCode=0 Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.794326 4898 generic.go:334] "Generic (PLEG): container finished" podID="73500676-55b3-4022-b71b-e9ea005da327" containerID="dda3ce5a58ed31eafa9421cc97414ff3691240e3840ac40fe6ba7cc2098b9615" exitCode=0 Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.794337 4898 generic.go:334] "Generic (PLEG): container finished" podID="73500676-55b3-4022-b71b-e9ea005da327" containerID="f7538843310d5e936ec46984e8748bf7460107648430e87a6f8f4ef09f14552a" exitCode=0 Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 
13:28:27.794359 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73500676-55b3-4022-b71b-e9ea005da327","Type":"ContainerDied","Data":"6b1647c5996bc80836e8b9639147d4427c9df0cbbb330afd8aac53079812cb44"} Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.794474 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73500676-55b3-4022-b71b-e9ea005da327","Type":"ContainerDied","Data":"dda3ce5a58ed31eafa9421cc97414ff3691240e3840ac40fe6ba7cc2098b9615"} Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.794486 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73500676-55b3-4022-b71b-e9ea005da327","Type":"ContainerDied","Data":"f7538843310d5e936ec46984e8748bf7460107648430e87a6f8f4ef09f14552a"} Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.797144 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5xg7\" (UniqueName: \"kubernetes.io/projected/b930380f-7595-4ade-99f4-30a280f023ff-kube-api-access-r5xg7\") pod \"nova-metadata-0\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " pod="openstack/nova-metadata-0" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.851904 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.852039 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg7vs\" (UniqueName: \"kubernetes.io/projected/15ab34e2-564c-479b-a8af-c428abb92e17-kube-api-access-dg7vs\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " 
pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.852111 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.852161 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-config\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.852185 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.852227 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.853344 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc 
kubenswrapper[4898]: I1211 13:28:27.853607 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.853959 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-config\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.864590 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.888332 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:27 crc kubenswrapper[4898]: I1211 13:28:27.901577 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg7vs\" (UniqueName: \"kubernetes.io/projected/15ab34e2-564c-479b-a8af-c428abb92e17-kube-api-access-dg7vs\") pod \"dnsmasq-dns-568d7fd7cf-g8csh\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.025253 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.194018 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.523488 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dgfd5"] Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.819557 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.823560 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73500676-55b3-4022-b71b-e9ea005da327","Type":"ContainerDied","Data":"b61db2069e5c7c6e97ddf6b6bab7e5320948ad1d9efbd62a7895ebd1c25250a4"} Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.823620 4898 scope.go:117] "RemoveContainer" containerID="6b1647c5996bc80836e8b9639147d4427c9df0cbbb330afd8aac53079812cb44" Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.826468 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dgfd5" event={"ID":"17ff0417-b6d7-42f4-9de4-e2482b659fc2","Type":"ContainerStarted","Data":"133a88df083acfde8e58f68551b4fbb1a61e8ee829f40e6ad09b7553d20f6bf3"} Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.853029 4898 scope.go:117] "RemoveContainer" containerID="4876e5c8b89988f7bc81d291e1be363f2b4f1e01c8e5cb6cd4a53a2b8f2bf4e8" Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.899869 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-config-data\") pod \"73500676-55b3-4022-b71b-e9ea005da327\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.899942 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-combined-ca-bundle\") pod \"73500676-55b3-4022-b71b-e9ea005da327\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.899967 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73500676-55b3-4022-b71b-e9ea005da327-log-httpd\") pod \"73500676-55b3-4022-b71b-e9ea005da327\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.900199 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54x5g\" (UniqueName: \"kubernetes.io/projected/73500676-55b3-4022-b71b-e9ea005da327-kube-api-access-54x5g\") pod \"73500676-55b3-4022-b71b-e9ea005da327\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.900246 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-scripts\") pod \"73500676-55b3-4022-b71b-e9ea005da327\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.900331 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73500676-55b3-4022-b71b-e9ea005da327-run-httpd\") pod \"73500676-55b3-4022-b71b-e9ea005da327\" (UID: \"73500676-55b3-4022-b71b-e9ea005da327\") " Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.900354 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-sg-core-conf-yaml\") pod \"73500676-55b3-4022-b71b-e9ea005da327\" (UID: 
\"73500676-55b3-4022-b71b-e9ea005da327\") " Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.909628 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.910188 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73500676-55b3-4022-b71b-e9ea005da327-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "73500676-55b3-4022-b71b-e9ea005da327" (UID: "73500676-55b3-4022-b71b-e9ea005da327"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.913153 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73500676-55b3-4022-b71b-e9ea005da327-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "73500676-55b3-4022-b71b-e9ea005da327" (UID: "73500676-55b3-4022-b71b-e9ea005da327"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.915091 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-scripts" (OuterVolumeSpecName: "scripts") pod "73500676-55b3-4022-b71b-e9ea005da327" (UID: "73500676-55b3-4022-b71b-e9ea005da327"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.915129 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73500676-55b3-4022-b71b-e9ea005da327-kube-api-access-54x5g" (OuterVolumeSpecName: "kube-api-access-54x5g") pod "73500676-55b3-4022-b71b-e9ea005da327" (UID: "73500676-55b3-4022-b71b-e9ea005da327"). InnerVolumeSpecName "kube-api-access-54x5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.916225 4898 scope.go:117] "RemoveContainer" containerID="dda3ce5a58ed31eafa9421cc97414ff3691240e3840ac40fe6ba7cc2098b9615" Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.935445 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.968275 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "73500676-55b3-4022-b71b-e9ea005da327" (UID: "73500676-55b3-4022-b71b-e9ea005da327"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:28 crc kubenswrapper[4898]: I1211 13:28:28.994290 4898 scope.go:117] "RemoveContainer" containerID="f7538843310d5e936ec46984e8748bf7460107648430e87a6f8f4ef09f14552a" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.003936 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54x5g\" (UniqueName: \"kubernetes.io/projected/73500676-55b3-4022-b71b-e9ea005da327-kube-api-access-54x5g\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.003974 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.003986 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73500676-55b3-4022-b71b-e9ea005da327-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.003994 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.004003 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73500676-55b3-4022-b71b-e9ea005da327-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.057977 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73500676-55b3-4022-b71b-e9ea005da327" (UID: "73500676-55b3-4022-b71b-e9ea005da327"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.088989 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-config-data" (OuterVolumeSpecName: "config-data") pod "73500676-55b3-4022-b71b-e9ea005da327" (UID: "73500676-55b3-4022-b71b-e9ea005da327"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.108216 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.108258 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73500676-55b3-4022-b71b-e9ea005da327-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.346680 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-hbds7"] Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.393679 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.413600 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-3c53-account-create-update-kzh6s"] Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.427520 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sjh99"] Dec 11 13:28:29 crc kubenswrapper[4898]: E1211 13:28:29.428076 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="ceilometer-notification-agent" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.428114 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="ceilometer-notification-agent" Dec 11 13:28:29 crc kubenswrapper[4898]: E1211 13:28:29.428139 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="ceilometer-central-agent" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.428145 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="ceilometer-central-agent" Dec 11 13:28:29 crc kubenswrapper[4898]: E1211 13:28:29.428179 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="sg-core" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.428186 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="sg-core" Dec 11 13:28:29 crc kubenswrapper[4898]: E1211 13:28:29.428203 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="proxy-httpd" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.428209 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="proxy-httpd" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.428471 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="sg-core" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.428489 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="ceilometer-central-agent" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.428504 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="ceilometer-notification-agent" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.428520 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="73500676-55b3-4022-b71b-e9ea005da327" containerName="proxy-httpd" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.429651 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sjh99" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.432101 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.432838 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.441195 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.456969 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sjh99"] Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.507553 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-g8csh"] Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.522236 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-config-data\") pod \"nova-cell1-conductor-db-sync-sjh99\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " pod="openstack/nova-cell1-conductor-db-sync-sjh99" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.522284 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-sjh99\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " pod="openstack/nova-cell1-conductor-db-sync-sjh99" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.522450 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7w6q\" (UniqueName: 
\"kubernetes.io/projected/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-kube-api-access-v7w6q\") pod \"nova-cell1-conductor-db-sync-sjh99\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " pod="openstack/nova-cell1-conductor-db-sync-sjh99" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.522931 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-scripts\") pod \"nova-cell1-conductor-db-sync-sjh99\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " pod="openstack/nova-cell1-conductor-db-sync-sjh99" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.628042 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-scripts\") pod \"nova-cell1-conductor-db-sync-sjh99\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " pod="openstack/nova-cell1-conductor-db-sync-sjh99" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.628551 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-config-data\") pod \"nova-cell1-conductor-db-sync-sjh99\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " pod="openstack/nova-cell1-conductor-db-sync-sjh99" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.628589 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-sjh99\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " pod="openstack/nova-cell1-conductor-db-sync-sjh99" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.628655 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7w6q\" (UniqueName: 
\"kubernetes.io/projected/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-kube-api-access-v7w6q\") pod \"nova-cell1-conductor-db-sync-sjh99\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " pod="openstack/nova-cell1-conductor-db-sync-sjh99" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.635751 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-sjh99\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " pod="openstack/nova-cell1-conductor-db-sync-sjh99" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.635967 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-config-data\") pod \"nova-cell1-conductor-db-sync-sjh99\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " pod="openstack/nova-cell1-conductor-db-sync-sjh99" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.640224 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-scripts\") pod \"nova-cell1-conductor-db-sync-sjh99\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " pod="openstack/nova-cell1-conductor-db-sync-sjh99" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.653332 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7w6q\" (UniqueName: \"kubernetes.io/projected/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-kube-api-access-v7w6q\") pod \"nova-cell1-conductor-db-sync-sjh99\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " pod="openstack/nova-cell1-conductor-db-sync-sjh99" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.795176 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sjh99" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.847096 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b930380f-7595-4ade-99f4-30a280f023ff","Type":"ContainerStarted","Data":"3959be6e66bc3876876600b4ebdb7d78fefd633cab5ecef2f1b6f7a1fe4d4410"} Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.848840 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8a29489d-803b-4a04-a612-485c4a015ccf","Type":"ContainerStarted","Data":"d4fc65b12d369a8260658d38a956f3bb70764607dcdf697e3b37085b2592ea53"} Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.850549 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-hbds7" event={"ID":"919f5290-479c-4afc-9160-f69d2f2b1e09","Type":"ContainerStarted","Data":"01a797987a20399dd26baf2edefd4ffb0a1e8252bd9a262fc6820fde1b0e942b"} Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.851727 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.865742 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"70ba1510-a232-4614-bced-c1afee3bd9b2","Type":"ContainerStarted","Data":"e25ffe273c6a89cb5f8d40bdd5dc5663f71d640d048239602d586867d8343925"} Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.867547 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" event={"ID":"15ab34e2-564c-479b-a8af-c428abb92e17","Type":"ContainerStarted","Data":"b0649ae698e71d91b30746b58d42bd209699f98c8a5fa50d81de43885811fd5a"} Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.870479 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-3c53-account-create-update-kzh6s" event={"ID":"5ca26903-ffb9-4637-aac0-5284a81cbe85","Type":"ContainerStarted","Data":"4ce2452881b345f6489849122fafc1fb2135af6ced96adcf3276e683e52ca7b3"} Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.873151 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dgfd5" event={"ID":"17ff0417-b6d7-42f4-9de4-e2482b659fc2","Type":"ContainerStarted","Data":"f697e132855971325a13977d240628b9d340ff5b8fbc1d55606c7d0c7520615b"} Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.875495 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"be99ef39-7e89-4189-8f85-91cc80505896","Type":"ContainerStarted","Data":"76d373a189cf13dc25d257ac6796be2e989e06d66a8ed6a0e4426f1f6e638f92"} Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.898609 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-dgfd5" podStartSLOduration=3.89859031 podStartE2EDuration="3.89859031s" podCreationTimestamp="2025-12-11 13:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:28:29.892579928 +0000 UTC m=+1467.464906365" watchObservedRunningTime="2025-12-11 13:28:29.89859031 +0000 UTC m=+1467.470916747" Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.953856 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:28:29 crc kubenswrapper[4898]: I1211 13:28:29.970937 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.022312 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.027162 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.034923 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.035158 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.044101 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.088799 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-run-httpd\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.089155 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " 
pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.089278 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-log-httpd\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.089423 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.089524 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-scripts\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.089562 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-config-data\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.089587 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p2zs\" (UniqueName: \"kubernetes.io/projected/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-kube-api-access-7p2zs\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.191307 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.191391 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-scripts\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.191420 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-config-data\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.191441 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p2zs\" (UniqueName: \"kubernetes.io/projected/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-kube-api-access-7p2zs\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.191504 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-run-httpd\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.191569 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.191630 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-log-httpd\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.192081 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-log-httpd\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.194793 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-run-httpd\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.199424 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.200682 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-config-data\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.200975 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.201697 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-scripts\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.213427 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p2zs\" (UniqueName: \"kubernetes.io/projected/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-kube-api-access-7p2zs\") pod \"ceilometer-0\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.387551 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:28:30 crc kubenswrapper[4898]: I1211 13:28:30.552349 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sjh99"] Dec 11 13:28:31 crc kubenswrapper[4898]: W1211 13:28:30.654835 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac397035_8dbc_46c6_8e4c_d4c25cb38c8f.slice/crio-e334c1f355641ba3ddf2f69609c6fe2029eeedfb31187bcfb4ca0ed77c783b30 WatchSource:0}: Error finding container e334c1f355641ba3ddf2f69609c6fe2029eeedfb31187bcfb4ca0ed77c783b30: Status 404 returned error can't find the container with id e334c1f355641ba3ddf2f69609c6fe2029eeedfb31187bcfb4ca0ed77c783b30 Dec 11 13:28:31 crc kubenswrapper[4898]: I1211 13:28:30.800036 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73500676-55b3-4022-b71b-e9ea005da327" 
path="/var/lib/kubelet/pods/73500676-55b3-4022-b71b-e9ea005da327/volumes" Dec 11 13:28:31 crc kubenswrapper[4898]: I1211 13:28:30.894741 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sjh99" event={"ID":"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f","Type":"ContainerStarted","Data":"e334c1f355641ba3ddf2f69609c6fe2029eeedfb31187bcfb4ca0ed77c783b30"} Dec 11 13:28:31 crc kubenswrapper[4898]: I1211 13:28:30.897768 4898 generic.go:334] "Generic (PLEG): container finished" podID="919f5290-479c-4afc-9160-f69d2f2b1e09" containerID="f5f0d7867b6e6d2bfc699b65ac6b96c700a4e6aaf23dcb4464fe1c5b49fa3507" exitCode=0 Dec 11 13:28:31 crc kubenswrapper[4898]: I1211 13:28:30.897810 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-hbds7" event={"ID":"919f5290-479c-4afc-9160-f69d2f2b1e09","Type":"ContainerDied","Data":"f5f0d7867b6e6d2bfc699b65ac6b96c700a4e6aaf23dcb4464fe1c5b49fa3507"} Dec 11 13:28:31 crc kubenswrapper[4898]: I1211 13:28:30.904092 4898 generic.go:334] "Generic (PLEG): container finished" podID="15ab34e2-564c-479b-a8af-c428abb92e17" containerID="514a94234818ba822cb459a5fdaa1a0acdd09e6aa1865cffe12dc7c985d3bc5c" exitCode=0 Dec 11 13:28:31 crc kubenswrapper[4898]: I1211 13:28:30.904163 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" event={"ID":"15ab34e2-564c-479b-a8af-c428abb92e17","Type":"ContainerDied","Data":"514a94234818ba822cb459a5fdaa1a0acdd09e6aa1865cffe12dc7c985d3bc5c"} Dec 11 13:28:31 crc kubenswrapper[4898]: I1211 13:28:30.920308 4898 generic.go:334] "Generic (PLEG): container finished" podID="5ca26903-ffb9-4637-aac0-5284a81cbe85" containerID="10e802201759470e332ecae05d7658d71baf79f445281506f97ea9311a53af8e" exitCode=0 Dec 11 13:28:31 crc kubenswrapper[4898]: I1211 13:28:30.921026 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-3c53-account-create-update-kzh6s" 
event={"ID":"5ca26903-ffb9-4637-aac0-5284a81cbe85","Type":"ContainerDied","Data":"10e802201759470e332ecae05d7658d71baf79f445281506f97ea9311a53af8e"} Dec 11 13:28:31 crc kubenswrapper[4898]: I1211 13:28:31.520142 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:28:31 crc kubenswrapper[4898]: I1211 13:28:31.532013 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 13:28:31 crc kubenswrapper[4898]: I1211 13:28:31.935052 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" event={"ID":"15ab34e2-564c-479b-a8af-c428abb92e17","Type":"ContainerStarted","Data":"a3c664e55bf0186f9da9cc6fb55b1679a3f6b85c6614e1aacc0f6f280ef624d8"} Dec 11 13:28:31 crc kubenswrapper[4898]: I1211 13:28:31.935430 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:31 crc kubenswrapper[4898]: I1211 13:28:31.939401 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sjh99" event={"ID":"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f","Type":"ContainerStarted","Data":"b25bab6209ad2fceaf30d8c7226b13536eb0afd152d8951b38bf8d6cf63418d8"} Dec 11 13:28:31 crc kubenswrapper[4898]: I1211 13:28:31.959429 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" podStartSLOduration=4.959416503 podStartE2EDuration="4.959416503s" podCreationTimestamp="2025-12-11 13:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:28:31.95821032 +0000 UTC m=+1469.530536747" watchObservedRunningTime="2025-12-11 13:28:31.959416503 +0000 UTC m=+1469.531742940" Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.209651 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-conductor-db-sync-sjh99" podStartSLOduration=3.209629653 podStartE2EDuration="3.209629653s" podCreationTimestamp="2025-12-11 13:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:28:31.971915731 +0000 UTC m=+1469.544242168" watchObservedRunningTime="2025-12-11 13:28:32.209629653 +0000 UTC m=+1469.781956090" Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.233098 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:28:32 crc kubenswrapper[4898]: W1211 13:28:32.670648 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3709b73_c7ab_4ad7_8d8d_2dc55ea4e46f.slice/crio-f38dd567c531a4bddff5efd86837853f031eecc1207dc7144de9bfd6c482f126 WatchSource:0}: Error finding container f38dd567c531a4bddff5efd86837853f031eecc1207dc7144de9bfd6c482f126: Status 404 returned error can't find the container with id f38dd567c531a4bddff5efd86837853f031eecc1207dc7144de9bfd6c482f126 Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.800044 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-3c53-account-create-update-kzh6s" Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.805171 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-hbds7" Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.967637 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f","Type":"ContainerStarted","Data":"f38dd567c531a4bddff5efd86837853f031eecc1207dc7144de9bfd6c482f126"} Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.971128 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-hbds7" event={"ID":"919f5290-479c-4afc-9160-f69d2f2b1e09","Type":"ContainerDied","Data":"01a797987a20399dd26baf2edefd4ffb0a1e8252bd9a262fc6820fde1b0e942b"} Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.971173 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01a797987a20399dd26baf2edefd4ffb0a1e8252bd9a262fc6820fde1b0e942b" Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.971259 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-hbds7" Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.973948 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd9qp\" (UniqueName: \"kubernetes.io/projected/5ca26903-ffb9-4637-aac0-5284a81cbe85-kube-api-access-cd9qp\") pod \"5ca26903-ffb9-4637-aac0-5284a81cbe85\" (UID: \"5ca26903-ffb9-4637-aac0-5284a81cbe85\") " Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.974026 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l92xm\" (UniqueName: \"kubernetes.io/projected/919f5290-479c-4afc-9160-f69d2f2b1e09-kube-api-access-l92xm\") pod \"919f5290-479c-4afc-9160-f69d2f2b1e09\" (UID: \"919f5290-479c-4afc-9160-f69d2f2b1e09\") " Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.974200 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/919f5290-479c-4afc-9160-f69d2f2b1e09-operator-scripts\") pod \"919f5290-479c-4afc-9160-f69d2f2b1e09\" (UID: \"919f5290-479c-4afc-9160-f69d2f2b1e09\") " Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.974270 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ca26903-ffb9-4637-aac0-5284a81cbe85-operator-scripts\") pod \"5ca26903-ffb9-4637-aac0-5284a81cbe85\" (UID: \"5ca26903-ffb9-4637-aac0-5284a81cbe85\") " Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.975313 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ca26903-ffb9-4637-aac0-5284a81cbe85-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ca26903-ffb9-4637-aac0-5284a81cbe85" (UID: "5ca26903-ffb9-4637-aac0-5284a81cbe85"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.980345 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-3c53-account-create-update-kzh6s" Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.980897 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-3c53-account-create-update-kzh6s" event={"ID":"5ca26903-ffb9-4637-aac0-5284a81cbe85","Type":"ContainerDied","Data":"4ce2452881b345f6489849122fafc1fb2135af6ced96adcf3276e683e52ca7b3"} Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.980932 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ce2452881b345f6489849122fafc1fb2135af6ced96adcf3276e683e52ca7b3" Dec 11 13:28:32 crc kubenswrapper[4898]: I1211 13:28:32.989429 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ca26903-ffb9-4637-aac0-5284a81cbe85-kube-api-access-cd9qp" (OuterVolumeSpecName: "kube-api-access-cd9qp") pod "5ca26903-ffb9-4637-aac0-5284a81cbe85" (UID: "5ca26903-ffb9-4637-aac0-5284a81cbe85"). InnerVolumeSpecName "kube-api-access-cd9qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:28:33 crc kubenswrapper[4898]: I1211 13:28:33.004251 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919f5290-479c-4afc-9160-f69d2f2b1e09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "919f5290-479c-4afc-9160-f69d2f2b1e09" (UID: "919f5290-479c-4afc-9160-f69d2f2b1e09"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:28:33 crc kubenswrapper[4898]: I1211 13:28:33.011208 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919f5290-479c-4afc-9160-f69d2f2b1e09-kube-api-access-l92xm" (OuterVolumeSpecName: "kube-api-access-l92xm") pod "919f5290-479c-4afc-9160-f69d2f2b1e09" (UID: "919f5290-479c-4afc-9160-f69d2f2b1e09"). InnerVolumeSpecName "kube-api-access-l92xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:28:33 crc kubenswrapper[4898]: I1211 13:28:33.077522 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l92xm\" (UniqueName: \"kubernetes.io/projected/919f5290-479c-4afc-9160-f69d2f2b1e09-kube-api-access-l92xm\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:33 crc kubenswrapper[4898]: I1211 13:28:33.077564 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/919f5290-479c-4afc-9160-f69d2f2b1e09-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:33 crc kubenswrapper[4898]: I1211 13:28:33.077577 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ca26903-ffb9-4637-aac0-5284a81cbe85-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:33 crc kubenswrapper[4898]: I1211 13:28:33.077587 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd9qp\" (UniqueName: \"kubernetes.io/projected/5ca26903-ffb9-4637-aac0-5284a81cbe85-kube-api-access-cd9qp\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:34 crc kubenswrapper[4898]: I1211 13:28:34.995827 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:28:34 
crc kubenswrapper[4898]: I1211 13:28:34.996502 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:28:34 crc kubenswrapper[4898]: I1211 13:28:34.996548 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:28:34 crc kubenswrapper[4898]: I1211 13:28:34.997427 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0cb68cd95a282ab2d60091b5e9b86bf1ff863de795c2f88e13ffcb75e9110201"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 13:28:34 crc kubenswrapper[4898]: I1211 13:28:34.997505 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://0cb68cd95a282ab2d60091b5e9b86bf1ff863de795c2f88e13ffcb75e9110201" gracePeriod=600 Dec 11 13:28:35 crc kubenswrapper[4898]: I1211 13:28:35.009757 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"be99ef39-7e89-4189-8f85-91cc80505896","Type":"ContainerStarted","Data":"e8513b56ae064c5a9aabd60c9682e79c6235cb547769908b23caacacfd0f694f"} Dec 11 13:28:35 crc kubenswrapper[4898]: I1211 13:28:35.015646 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"8a29489d-803b-4a04-a612-485c4a015ccf","Type":"ContainerStarted","Data":"f9732cc402e347d4b36742d400e8ab9e5305266fb1c433be67d4bdf39775bc4b"} Dec 11 13:28:35 crc kubenswrapper[4898]: I1211 13:28:35.023840 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f","Type":"ContainerStarted","Data":"4ac829fa4779e37c5e1d3153325769c40f5a68d2c0629692f66c1f8d44a2b591"} Dec 11 13:28:35 crc kubenswrapper[4898]: I1211 13:28:35.033005 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b930380f-7595-4ade-99f4-30a280f023ff","Type":"ContainerStarted","Data":"f14258e237652231b3a31dcb33a7a2070e8e7beb428c05889fd816f61d08d6c6"} Dec 11 13:28:35 crc kubenswrapper[4898]: I1211 13:28:35.037293 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"70ba1510-a232-4614-bced-c1afee3bd9b2","Type":"ContainerStarted","Data":"98333e506b604b553b1094b55473be7e6becb832ff69dccac03718315c85834a"} Dec 11 13:28:35 crc kubenswrapper[4898]: I1211 13:28:35.040934 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="70ba1510-a232-4614-bced-c1afee3bd9b2" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://98333e506b604b553b1094b55473be7e6becb832ff69dccac03718315c85834a" gracePeriod=30 Dec 11 13:28:35 crc kubenswrapper[4898]: I1211 13:28:35.048908 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.486242142 podStartE2EDuration="9.048886473s" podCreationTimestamp="2025-12-11 13:28:26 +0000 UTC" firstStartedPulling="2025-12-11 13:28:28.93488453 +0000 UTC m=+1466.507210967" lastFinishedPulling="2025-12-11 13:28:34.497528861 +0000 UTC m=+1472.069855298" observedRunningTime="2025-12-11 13:28:35.033009783 +0000 UTC m=+1472.605336220" 
watchObservedRunningTime="2025-12-11 13:28:35.048886473 +0000 UTC m=+1472.621212910" Dec 11 13:28:35 crc kubenswrapper[4898]: I1211 13:28:35.065119 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.482664474 podStartE2EDuration="8.065096911s" podCreationTimestamp="2025-12-11 13:28:27 +0000 UTC" firstStartedPulling="2025-12-11 13:28:28.945726643 +0000 UTC m=+1466.518053080" lastFinishedPulling="2025-12-11 13:28:34.52815908 +0000 UTC m=+1472.100485517" observedRunningTime="2025-12-11 13:28:35.064923517 +0000 UTC m=+1472.637249954" watchObservedRunningTime="2025-12-11 13:28:35.065096911 +0000 UTC m=+1472.637423348" Dec 11 13:28:36 crc kubenswrapper[4898]: I1211 13:28:36.057563 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="0cb68cd95a282ab2d60091b5e9b86bf1ff863de795c2f88e13ffcb75e9110201" exitCode=0 Dec 11 13:28:36 crc kubenswrapper[4898]: I1211 13:28:36.057720 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"0cb68cd95a282ab2d60091b5e9b86bf1ff863de795c2f88e13ffcb75e9110201"} Dec 11 13:28:36 crc kubenswrapper[4898]: I1211 13:28:36.060502 4898 scope.go:117] "RemoveContainer" containerID="ca4a970ba19c45b9f6200362f9a5d9ef16d6404aa1395da5964e4d307dc4af2f" Dec 11 13:28:36 crc kubenswrapper[4898]: I1211 13:28:36.060573 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901"} Dec 11 13:28:36 crc kubenswrapper[4898]: I1211 13:28:36.075058 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"be99ef39-7e89-4189-8f85-91cc80505896","Type":"ContainerStarted","Data":"f5b03c43b8097f88807399d5ebb212023d00d6c2007613ef8b15039bd91d0b96"} Dec 11 13:28:36 crc kubenswrapper[4898]: I1211 13:28:36.082406 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b930380f-7595-4ade-99f4-30a280f023ff" containerName="nova-metadata-log" containerID="cri-o://f14258e237652231b3a31dcb33a7a2070e8e7beb428c05889fd816f61d08d6c6" gracePeriod=30 Dec 11 13:28:36 crc kubenswrapper[4898]: I1211 13:28:36.082778 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b930380f-7595-4ade-99f4-30a280f023ff" containerName="nova-metadata-metadata" containerID="cri-o://ebfc7617de1531a0081125be8d0c524db3f6f39ef7bc1c3fa0f9117feb144698" gracePeriod=30 Dec 11 13:28:36 crc kubenswrapper[4898]: I1211 13:28:36.082898 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b930380f-7595-4ade-99f4-30a280f023ff","Type":"ContainerStarted","Data":"ebfc7617de1531a0081125be8d0c524db3f6f39ef7bc1c3fa0f9117feb144698"} Dec 11 13:28:36 crc kubenswrapper[4898]: I1211 13:28:36.102054 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=5.000836749 podStartE2EDuration="10.102039323s" podCreationTimestamp="2025-12-11 13:28:26 +0000 UTC" firstStartedPulling="2025-12-11 13:28:29.458379047 +0000 UTC m=+1467.030705484" lastFinishedPulling="2025-12-11 13:28:34.559581621 +0000 UTC m=+1472.131908058" observedRunningTime="2025-12-11 13:28:36.101274353 +0000 UTC m=+1473.673600810" watchObservedRunningTime="2025-12-11 13:28:36.102039323 +0000 UTC m=+1473.674365760" Dec 11 13:28:36 crc kubenswrapper[4898]: I1211 13:28:36.125623 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.037409509 
podStartE2EDuration="9.125608441s" podCreationTimestamp="2025-12-11 13:28:27 +0000 UTC" firstStartedPulling="2025-12-11 13:28:29.424548311 +0000 UTC m=+1466.996874748" lastFinishedPulling="2025-12-11 13:28:34.512747243 +0000 UTC m=+1472.085073680" observedRunningTime="2025-12-11 13:28:36.116445013 +0000 UTC m=+1473.688771480" watchObservedRunningTime="2025-12-11 13:28:36.125608441 +0000 UTC m=+1473.697934878" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.172096 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f","Type":"ContainerStarted","Data":"6fc713df7449b7c4adc18bdc23c5d4003539ee4c4ce910a2cffe2c474eeb0e4e"} Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.182882 4898 generic.go:334] "Generic (PLEG): container finished" podID="b930380f-7595-4ade-99f4-30a280f023ff" containerID="ebfc7617de1531a0081125be8d0c524db3f6f39ef7bc1c3fa0f9117feb144698" exitCode=0 Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.182911 4898 generic.go:334] "Generic (PLEG): container finished" podID="b930380f-7595-4ade-99f4-30a280f023ff" containerID="f14258e237652231b3a31dcb33a7a2070e8e7beb428c05889fd816f61d08d6c6" exitCode=143 Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.182952 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b930380f-7595-4ade-99f4-30a280f023ff","Type":"ContainerDied","Data":"ebfc7617de1531a0081125be8d0c524db3f6f39ef7bc1c3fa0f9117feb144698"} Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.182991 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b930380f-7595-4ade-99f4-30a280f023ff","Type":"ContainerDied","Data":"f14258e237652231b3a31dcb33a7a2070e8e7beb428c05889fd816f61d08d6c6"} Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.464877 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 
11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.465343 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.519518 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-f9lpn"] Dec 11 13:28:37 crc kubenswrapper[4898]: E1211 13:28:37.520118 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919f5290-479c-4afc-9160-f69d2f2b1e09" containerName="mariadb-database-create" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.520138 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="919f5290-479c-4afc-9160-f69d2f2b1e09" containerName="mariadb-database-create" Dec 11 13:28:37 crc kubenswrapper[4898]: E1211 13:28:37.520154 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca26903-ffb9-4637-aac0-5284a81cbe85" containerName="mariadb-account-create-update" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.520160 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca26903-ffb9-4637-aac0-5284a81cbe85" containerName="mariadb-account-create-update" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.520420 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="919f5290-479c-4afc-9160-f69d2f2b1e09" containerName="mariadb-database-create" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.520471 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ca26903-ffb9-4637-aac0-5284a81cbe85" containerName="mariadb-account-create-update" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.521385 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.529285 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.529638 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.529931 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.530626 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-h8xn6" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.538423 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-f9lpn"] Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.553094 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.554665 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.555356 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.568655 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.575017 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.607177 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b930380f-7595-4ade-99f4-30a280f023ff-config-data\") pod \"b930380f-7595-4ade-99f4-30a280f023ff\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.607614 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5xg7\" (UniqueName: \"kubernetes.io/projected/b930380f-7595-4ade-99f4-30a280f023ff-kube-api-access-r5xg7\") pod \"b930380f-7595-4ade-99f4-30a280f023ff\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.607711 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b930380f-7595-4ade-99f4-30a280f023ff-combined-ca-bundle\") pod \"b930380f-7595-4ade-99f4-30a280f023ff\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.607834 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b930380f-7595-4ade-99f4-30a280f023ff-logs\") pod \"b930380f-7595-4ade-99f4-30a280f023ff\" (UID: \"b930380f-7595-4ade-99f4-30a280f023ff\") " Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.608323 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-scripts\") pod \"aodh-db-sync-f9lpn\" (UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " 
pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.614369 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-config-data\") pod \"aodh-db-sync-f9lpn\" (UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.614852 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-combined-ca-bundle\") pod \"aodh-db-sync-f9lpn\" (UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.615005 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hl6r\" (UniqueName: \"kubernetes.io/projected/0b381d75-882f-425b-9c58-ec00804fda34-kube-api-access-2hl6r\") pod \"aodh-db-sync-f9lpn\" (UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.610623 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b930380f-7595-4ade-99f4-30a280f023ff-logs" (OuterVolumeSpecName: "logs") pod "b930380f-7595-4ade-99f4-30a280f023ff" (UID: "b930380f-7595-4ade-99f4-30a280f023ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.626740 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b930380f-7595-4ade-99f4-30a280f023ff-kube-api-access-r5xg7" (OuterVolumeSpecName: "kube-api-access-r5xg7") pod "b930380f-7595-4ade-99f4-30a280f023ff" (UID: "b930380f-7595-4ade-99f4-30a280f023ff"). 
InnerVolumeSpecName "kube-api-access-r5xg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.651678 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b930380f-7595-4ade-99f4-30a280f023ff-config-data" (OuterVolumeSpecName: "config-data") pod "b930380f-7595-4ade-99f4-30a280f023ff" (UID: "b930380f-7595-4ade-99f4-30a280f023ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.674987 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b930380f-7595-4ade-99f4-30a280f023ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b930380f-7595-4ade-99f4-30a280f023ff" (UID: "b930380f-7595-4ade-99f4-30a280f023ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.718903 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-combined-ca-bundle\") pod \"aodh-db-sync-f9lpn\" (UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.718981 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hl6r\" (UniqueName: \"kubernetes.io/projected/0b381d75-882f-425b-9c58-ec00804fda34-kube-api-access-2hl6r\") pod \"aodh-db-sync-f9lpn\" (UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.719115 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-scripts\") pod \"aodh-db-sync-f9lpn\" 
(UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.719217 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-config-data\") pod \"aodh-db-sync-f9lpn\" (UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.719383 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b930380f-7595-4ade-99f4-30a280f023ff-logs\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.719397 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b930380f-7595-4ade-99f4-30a280f023ff-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.719410 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5xg7\" (UniqueName: \"kubernetes.io/projected/b930380f-7595-4ade-99f4-30a280f023ff-kube-api-access-r5xg7\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.719422 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b930380f-7595-4ade-99f4-30a280f023ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.726807 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-config-data\") pod \"aodh-db-sync-f9lpn\" (UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.727079 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-combined-ca-bundle\") pod \"aodh-db-sync-f9lpn\" (UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.730832 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-scripts\") pod \"aodh-db-sync-f9lpn\" (UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.737208 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hl6r\" (UniqueName: \"kubernetes.io/projected/0b381d75-882f-425b-9c58-ec00804fda34-kube-api-access-2hl6r\") pod \"aodh-db-sync-f9lpn\" (UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:37 crc kubenswrapper[4898]: I1211 13:28:37.902009 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.197658 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.203016 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f","Type":"ContainerStarted","Data":"2337b6827599a355a74164b0ff0906446a51991d184793e68007ee6f656a4d94"} Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.209207 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b930380f-7595-4ade-99f4-30a280f023ff","Type":"ContainerDied","Data":"3959be6e66bc3876876600b4ebdb7d78fefd633cab5ecef2f1b6f7a1fe4d4410"} Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.209310 4898 scope.go:117] "RemoveContainer" containerID="ebfc7617de1531a0081125be8d0c524db3f6f39ef7bc1c3fa0f9117feb144698" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.209370 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.263517 4898 scope.go:117] "RemoveContainer" containerID="f14258e237652231b3a31dcb33a7a2070e8e7beb428c05889fd816f61d08d6c6" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.331307 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.367941 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.380516 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-m8v8g"] Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.380818 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" podUID="3ec65fb5-ed03-4c86-b3da-19711d02ef4d" containerName="dnsmasq-dns" containerID="cri-o://34268be809bb953ad05606192e502b89eee8a121c7980b49b6c8fc38f9cb3a0b" gracePeriod=10 Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.395581 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.409753 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:28:38 crc kubenswrapper[4898]: E1211 13:28:38.410873 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b930380f-7595-4ade-99f4-30a280f023ff" containerName="nova-metadata-log" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.410975 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b930380f-7595-4ade-99f4-30a280f023ff" containerName="nova-metadata-log" Dec 11 13:28:38 crc kubenswrapper[4898]: E1211 13:28:38.411067 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b930380f-7595-4ade-99f4-30a280f023ff" containerName="nova-metadata-metadata" Dec 11 
13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.411124 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b930380f-7595-4ade-99f4-30a280f023ff" containerName="nova-metadata-metadata" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.411618 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b930380f-7595-4ade-99f4-30a280f023ff" containerName="nova-metadata-log" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.411707 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b930380f-7595-4ade-99f4-30a280f023ff" containerName="nova-metadata-metadata" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.413216 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.416375 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.416613 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.423538 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.452048 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") " pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.452477 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmg85\" (UniqueName: \"kubernetes.io/projected/0b122352-5f1f-4bf6-905a-e030611792de-kube-api-access-fmg85\") pod \"nova-metadata-0\" (UID: 
\"0b122352-5f1f-4bf6-905a-e030611792de\") " pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.452601 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b122352-5f1f-4bf6-905a-e030611792de-logs\") pod \"nova-metadata-0\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") " pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.452785 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-config-data\") pod \"nova-metadata-0\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") " pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.454529 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") " pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.569605 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-config-data\") pod \"nova-metadata-0\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") " pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.569696 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") " pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 
13:28:38.569919 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") " pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.570632 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmg85\" (UniqueName: \"kubernetes.io/projected/0b122352-5f1f-4bf6-905a-e030611792de-kube-api-access-fmg85\") pod \"nova-metadata-0\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") " pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.570694 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b122352-5f1f-4bf6-905a-e030611792de-logs\") pod \"nova-metadata-0\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") " pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.571475 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b122352-5f1f-4bf6-905a-e030611792de-logs\") pod \"nova-metadata-0\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") " pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.588418 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") " pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.590688 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-config-data\") pod \"nova-metadata-0\" 
(UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") " pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.594734 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmg85\" (UniqueName: \"kubernetes.io/projected/0b122352-5f1f-4bf6-905a-e030611792de-kube-api-access-fmg85\") pod \"nova-metadata-0\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") " pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.606252 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") " pod="openstack/nova-metadata-0" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.608423 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="be99ef39-7e89-4189-8f85-91cc80505896" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.230:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.626324 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-f9lpn"] Dec 11 13:28:38 crc kubenswrapper[4898]: W1211 13:28:38.664028 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b381d75_882f_425b_9c58_ec00804fda34.slice/crio-5f3eed33c0885d03279ce2a95ebb6547f783796c486de926cef7ab6f4edd873d WatchSource:0}: Error finding container 5f3eed33c0885d03279ce2a95ebb6547f783796c486de926cef7ab6f4edd873d: Status 404 returned error can't find the container with id 5f3eed33c0885d03279ce2a95ebb6547f783796c486de926cef7ab6f4edd873d Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.664125 4898 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="be99ef39-7e89-4189-8f85-91cc80505896" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.230:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.805934 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b930380f-7595-4ade-99f4-30a280f023ff" path="/var/lib/kubelet/pods/b930380f-7595-4ade-99f4-30a280f023ff/volumes" Dec 11 13:28:38 crc kubenswrapper[4898]: I1211 13:28:38.848553 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.253775 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f","Type":"ContainerStarted","Data":"4458eb4c87aa70fbd2c81f49dba275d2ee2a8dd1fc614ec66f570958f2340070"} Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.254176 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.258073 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-f9lpn" event={"ID":"0b381d75-882f-425b-9c58-ec00804fda34","Type":"ContainerStarted","Data":"5f3eed33c0885d03279ce2a95ebb6547f783796c486de926cef7ab6f4edd873d"} Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.263317 4898 generic.go:334] "Generic (PLEG): container finished" podID="3ec65fb5-ed03-4c86-b3da-19711d02ef4d" containerID="34268be809bb953ad05606192e502b89eee8a121c7980b49b6c8fc38f9cb3a0b" exitCode=0 Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.263372 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" event={"ID":"3ec65fb5-ed03-4c86-b3da-19711d02ef4d","Type":"ContainerDied","Data":"34268be809bb953ad05606192e502b89eee8a121c7980b49b6c8fc38f9cb3a0b"} 
Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.305624 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.386218298 podStartE2EDuration="10.305601481s" podCreationTimestamp="2025-12-11 13:28:29 +0000 UTC" firstStartedPulling="2025-12-11 13:28:33.795258775 +0000 UTC m=+1471.367585212" lastFinishedPulling="2025-12-11 13:28:38.714641958 +0000 UTC m=+1476.286968395" observedRunningTime="2025-12-11 13:28:39.291735256 +0000 UTC m=+1476.864061693" watchObservedRunningTime="2025-12-11 13:28:39.305601481 +0000 UTC m=+1476.877927918" Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.333780 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.395094 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf7zh\" (UniqueName: \"kubernetes.io/projected/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-kube-api-access-hf7zh\") pod \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.395192 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-dns-svc\") pod \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.395267 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-dns-swift-storage-0\") pod \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.395315 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-ovsdbserver-sb\") pod \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.395392 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-ovsdbserver-nb\") pod \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.395529 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-config\") pod \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\" (UID: \"3ec65fb5-ed03-4c86-b3da-19711d02ef4d\") " Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.403071 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-kube-api-access-hf7zh" (OuterVolumeSpecName: "kube-api-access-hf7zh") pod "3ec65fb5-ed03-4c86-b3da-19711d02ef4d" (UID: "3ec65fb5-ed03-4c86-b3da-19711d02ef4d"). InnerVolumeSpecName "kube-api-access-hf7zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.475870 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.488067 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-config" (OuterVolumeSpecName: "config") pod "3ec65fb5-ed03-4c86-b3da-19711d02ef4d" (UID: "3ec65fb5-ed03-4c86-b3da-19711d02ef4d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.508430 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.508476 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf7zh\" (UniqueName: \"kubernetes.io/projected/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-kube-api-access-hf7zh\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.524818 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3ec65fb5-ed03-4c86-b3da-19711d02ef4d" (UID: "3ec65fb5-ed03-4c86-b3da-19711d02ef4d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.550641 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3ec65fb5-ed03-4c86-b3da-19711d02ef4d" (UID: "3ec65fb5-ed03-4c86-b3da-19711d02ef4d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.552790 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3ec65fb5-ed03-4c86-b3da-19711d02ef4d" (UID: "3ec65fb5-ed03-4c86-b3da-19711d02ef4d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.611023 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.611072 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.611086 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.641591 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3ec65fb5-ed03-4c86-b3da-19711d02ef4d" (UID: "3ec65fb5-ed03-4c86-b3da-19711d02ef4d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:28:39 crc kubenswrapper[4898]: I1211 13:28:39.744029 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ec65fb5-ed03-4c86-b3da-19711d02ef4d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:40 crc kubenswrapper[4898]: I1211 13:28:40.285096 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" event={"ID":"3ec65fb5-ed03-4c86-b3da-19711d02ef4d","Type":"ContainerDied","Data":"6bfc24cc35b3b19439986ff3d40b2a88a2d588ca1324395af0c81f8e7ed41cd1"} Dec 11 13:28:40 crc kubenswrapper[4898]: I1211 13:28:40.285423 4898 scope.go:117] "RemoveContainer" containerID="34268be809bb953ad05606192e502b89eee8a121c7980b49b6c8fc38f9cb3a0b" Dec 11 13:28:40 crc kubenswrapper[4898]: I1211 13:28:40.285106 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-m8v8g" Dec 11 13:28:40 crc kubenswrapper[4898]: I1211 13:28:40.288710 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b122352-5f1f-4bf6-905a-e030611792de","Type":"ContainerStarted","Data":"91817998ae207d7c069cae2bb71f93ee500500383d401e2d9488753f47323842"} Dec 11 13:28:40 crc kubenswrapper[4898]: I1211 13:28:40.288891 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b122352-5f1f-4bf6-905a-e030611792de","Type":"ContainerStarted","Data":"7b39637beaaf84a4332a3d32006736cf8c161272fa5fb1d4e19016f2337171a8"} Dec 11 13:28:40 crc kubenswrapper[4898]: I1211 13:28:40.291231 4898 generic.go:334] "Generic (PLEG): container finished" podID="17ff0417-b6d7-42f4-9de4-e2482b659fc2" containerID="f697e132855971325a13977d240628b9d340ff5b8fbc1d55606c7d0c7520615b" exitCode=0 Dec 11 13:28:40 crc kubenswrapper[4898]: I1211 13:28:40.291266 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-dgfd5" event={"ID":"17ff0417-b6d7-42f4-9de4-e2482b659fc2","Type":"ContainerDied","Data":"f697e132855971325a13977d240628b9d340ff5b8fbc1d55606c7d0c7520615b"} Dec 11 13:28:40 crc kubenswrapper[4898]: I1211 13:28:40.356231 4898 scope.go:117] "RemoveContainer" containerID="3a1f98064dff3f8d2587ac4ebb0c37042beb56a5d07d4f78149ee4ec7c5a6a46" Dec 11 13:28:40 crc kubenswrapper[4898]: I1211 13:28:40.373917 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-m8v8g"] Dec 11 13:28:40 crc kubenswrapper[4898]: I1211 13:28:40.402697 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-m8v8g"] Dec 11 13:28:40 crc kubenswrapper[4898]: I1211 13:28:40.790054 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ec65fb5-ed03-4c86-b3da-19711d02ef4d" path="/var/lib/kubelet/pods/3ec65fb5-ed03-4c86-b3da-19711d02ef4d/volumes" Dec 11 13:28:41 crc kubenswrapper[4898]: I1211 13:28:41.313717 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b122352-5f1f-4bf6-905a-e030611792de","Type":"ContainerStarted","Data":"6b8347c8204905e123f632717b1f2f776b13c415eacde7bed18fa2e25df5ca5a"} Dec 11 13:28:41 crc kubenswrapper[4898]: I1211 13:28:41.347501 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.3474835 podStartE2EDuration="3.3474835s" podCreationTimestamp="2025-12-11 13:28:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:28:41.33713728 +0000 UTC m=+1478.909463717" watchObservedRunningTime="2025-12-11 13:28:41.3474835 +0000 UTC m=+1478.919809937" Dec 11 13:28:41 crc kubenswrapper[4898]: I1211 13:28:41.836431 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:41 crc kubenswrapper[4898]: I1211 13:28:41.916831 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-combined-ca-bundle\") pod \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " Dec 11 13:28:41 crc kubenswrapper[4898]: I1211 13:28:41.916945 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-config-data\") pod \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " Dec 11 13:28:41 crc kubenswrapper[4898]: I1211 13:28:41.917102 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-scripts\") pod \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " Dec 11 13:28:41 crc kubenswrapper[4898]: I1211 13:28:41.917196 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnvrx\" (UniqueName: \"kubernetes.io/projected/17ff0417-b6d7-42f4-9de4-e2482b659fc2-kube-api-access-pnvrx\") pod \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\" (UID: \"17ff0417-b6d7-42f4-9de4-e2482b659fc2\") " Dec 11 13:28:41 crc kubenswrapper[4898]: I1211 13:28:41.933971 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17ff0417-b6d7-42f4-9de4-e2482b659fc2-kube-api-access-pnvrx" (OuterVolumeSpecName: "kube-api-access-pnvrx") pod "17ff0417-b6d7-42f4-9de4-e2482b659fc2" (UID: "17ff0417-b6d7-42f4-9de4-e2482b659fc2"). InnerVolumeSpecName "kube-api-access-pnvrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:28:41 crc kubenswrapper[4898]: I1211 13:28:41.954029 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-scripts" (OuterVolumeSpecName: "scripts") pod "17ff0417-b6d7-42f4-9de4-e2482b659fc2" (UID: "17ff0417-b6d7-42f4-9de4-e2482b659fc2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:41 crc kubenswrapper[4898]: I1211 13:28:41.963289 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-config-data" (OuterVolumeSpecName: "config-data") pod "17ff0417-b6d7-42f4-9de4-e2482b659fc2" (UID: "17ff0417-b6d7-42f4-9de4-e2482b659fc2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 13:28:42.019798 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 13:28:42.020026 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnvrx\" (UniqueName: \"kubernetes.io/projected/17ff0417-b6d7-42f4-9de4-e2482b659fc2-kube-api-access-pnvrx\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 13:28:42.020039 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 13:28:42.028731 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17ff0417-b6d7-42f4-9de4-e2482b659fc2" 
(UID: "17ff0417-b6d7-42f4-9de4-e2482b659fc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 13:28:42.122405 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ff0417-b6d7-42f4-9de4-e2482b659fc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 13:28:42.334012 4898 generic.go:334] "Generic (PLEG): container finished" podID="ac397035-8dbc-46c6-8e4c-d4c25cb38c8f" containerID="b25bab6209ad2fceaf30d8c7226b13536eb0afd152d8951b38bf8d6cf63418d8" exitCode=0 Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 13:28:42.334076 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sjh99" event={"ID":"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f","Type":"ContainerDied","Data":"b25bab6209ad2fceaf30d8c7226b13536eb0afd152d8951b38bf8d6cf63418d8"} Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 13:28:42.338835 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dgfd5" Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 13:28:42.339333 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dgfd5" event={"ID":"17ff0417-b6d7-42f4-9de4-e2482b659fc2","Type":"ContainerDied","Data":"133a88df083acfde8e58f68551b4fbb1a61e8ee829f40e6ad09b7553d20f6bf3"} Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 13:28:42.339366 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="133a88df083acfde8e58f68551b4fbb1a61e8ee829f40e6ad09b7553d20f6bf3" Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 13:28:42.497358 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 13:28:42.497595 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="be99ef39-7e89-4189-8f85-91cc80505896" containerName="nova-api-log" containerID="cri-o://e8513b56ae064c5a9aabd60c9682e79c6235cb547769908b23caacacfd0f694f" gracePeriod=30 Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 13:28:42.498179 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="be99ef39-7e89-4189-8f85-91cc80505896" containerName="nova-api-api" containerID="cri-o://f5b03c43b8097f88807399d5ebb212023d00d6c2007613ef8b15039bd91d0b96" gracePeriod=30 Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 13:28:42.514776 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 13:28:42.514968 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8a29489d-803b-4a04-a612-485c4a015ccf" containerName="nova-scheduler-scheduler" containerID="cri-o://f9732cc402e347d4b36742d400e8ab9e5305266fb1c433be67d4bdf39775bc4b" gracePeriod=30 Dec 11 13:28:42 crc kubenswrapper[4898]: I1211 
13:28:42.543170 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.060629 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7g6rk"] Dec 11 13:28:43 crc kubenswrapper[4898]: E1211 13:28:43.061295 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17ff0417-b6d7-42f4-9de4-e2482b659fc2" containerName="nova-manage" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.061319 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="17ff0417-b6d7-42f4-9de4-e2482b659fc2" containerName="nova-manage" Dec 11 13:28:43 crc kubenswrapper[4898]: E1211 13:28:43.061367 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ec65fb5-ed03-4c86-b3da-19711d02ef4d" containerName="dnsmasq-dns" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.061376 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ec65fb5-ed03-4c86-b3da-19711d02ef4d" containerName="dnsmasq-dns" Dec 11 13:28:43 crc kubenswrapper[4898]: E1211 13:28:43.061441 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ec65fb5-ed03-4c86-b3da-19711d02ef4d" containerName="init" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.061475 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ec65fb5-ed03-4c86-b3da-19711d02ef4d" containerName="init" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.061858 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ec65fb5-ed03-4c86-b3da-19711d02ef4d" containerName="dnsmasq-dns" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.061898 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="17ff0417-b6d7-42f4-9de4-e2482b659fc2" containerName="nova-manage" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.064941 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.075570 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7g6rk"] Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.142938 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78074bfc-7a8f-4154-99f1-6c1abefca223-catalog-content\") pod \"certified-operators-7g6rk\" (UID: \"78074bfc-7a8f-4154-99f1-6c1abefca223\") " pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.143033 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq7p9\" (UniqueName: \"kubernetes.io/projected/78074bfc-7a8f-4154-99f1-6c1abefca223-kube-api-access-jq7p9\") pod \"certified-operators-7g6rk\" (UID: \"78074bfc-7a8f-4154-99f1-6c1abefca223\") " pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.143222 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78074bfc-7a8f-4154-99f1-6c1abefca223-utilities\") pod \"certified-operators-7g6rk\" (UID: \"78074bfc-7a8f-4154-99f1-6c1abefca223\") " pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.245696 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78074bfc-7a8f-4154-99f1-6c1abefca223-catalog-content\") pod \"certified-operators-7g6rk\" (UID: \"78074bfc-7a8f-4154-99f1-6c1abefca223\") " pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.245806 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jq7p9\" (UniqueName: \"kubernetes.io/projected/78074bfc-7a8f-4154-99f1-6c1abefca223-kube-api-access-jq7p9\") pod \"certified-operators-7g6rk\" (UID: \"78074bfc-7a8f-4154-99f1-6c1abefca223\") " pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.245966 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78074bfc-7a8f-4154-99f1-6c1abefca223-utilities\") pod \"certified-operators-7g6rk\" (UID: \"78074bfc-7a8f-4154-99f1-6c1abefca223\") " pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.246378 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78074bfc-7a8f-4154-99f1-6c1abefca223-utilities\") pod \"certified-operators-7g6rk\" (UID: \"78074bfc-7a8f-4154-99f1-6c1abefca223\") " pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.246634 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78074bfc-7a8f-4154-99f1-6c1abefca223-catalog-content\") pod \"certified-operators-7g6rk\" (UID: \"78074bfc-7a8f-4154-99f1-6c1abefca223\") " pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.264987 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq7p9\" (UniqueName: \"kubernetes.io/projected/78074bfc-7a8f-4154-99f1-6c1abefca223-kube-api-access-jq7p9\") pod \"certified-operators-7g6rk\" (UID: \"78074bfc-7a8f-4154-99f1-6c1abefca223\") " pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.386633 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="be99ef39-7e89-4189-8f85-91cc80505896" containerID="e8513b56ae064c5a9aabd60c9682e79c6235cb547769908b23caacacfd0f694f" exitCode=143 Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.386798 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"be99ef39-7e89-4189-8f85-91cc80505896","Type":"ContainerDied","Data":"e8513b56ae064c5a9aabd60c9682e79c6235cb547769908b23caacacfd0f694f"} Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.403543 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.406970 4898 generic.go:334] "Generic (PLEG): container finished" podID="8a29489d-803b-4a04-a612-485c4a015ccf" containerID="f9732cc402e347d4b36742d400e8ab9e5305266fb1c433be67d4bdf39775bc4b" exitCode=0 Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.407215 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8a29489d-803b-4a04-a612-485c4a015ccf","Type":"ContainerDied","Data":"f9732cc402e347d4b36742d400e8ab9e5305266fb1c433be67d4bdf39775bc4b"} Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.407367 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0b122352-5f1f-4bf6-905a-e030611792de" containerName="nova-metadata-log" containerID="cri-o://91817998ae207d7c069cae2bb71f93ee500500383d401e2d9488753f47323842" gracePeriod=30 Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.408074 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0b122352-5f1f-4bf6-905a-e030611792de" containerName="nova-metadata-metadata" containerID="cri-o://6b8347c8204905e123f632717b1f2f776b13c415eacde7bed18fa2e25df5ca5a" gracePeriod=30 Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.848869 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 11 13:28:43 crc kubenswrapper[4898]: I1211 13:28:43.849124 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.420070 4898 generic.go:334] "Generic (PLEG): container finished" podID="0b122352-5f1f-4bf6-905a-e030611792de" containerID="6b8347c8204905e123f632717b1f2f776b13c415eacde7bed18fa2e25df5ca5a" exitCode=0 Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.420100 4898 generic.go:334] "Generic (PLEG): container finished" podID="0b122352-5f1f-4bf6-905a-e030611792de" containerID="91817998ae207d7c069cae2bb71f93ee500500383d401e2d9488753f47323842" exitCode=143 Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.420118 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b122352-5f1f-4bf6-905a-e030611792de","Type":"ContainerDied","Data":"6b8347c8204905e123f632717b1f2f776b13c415eacde7bed18fa2e25df5ca5a"} Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.420141 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b122352-5f1f-4bf6-905a-e030611792de","Type":"ContainerDied","Data":"91817998ae207d7c069cae2bb71f93ee500500383d401e2d9488753f47323842"} Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.700199 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sjh99" Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.786354 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-scripts\") pod \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.786480 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7w6q\" (UniqueName: \"kubernetes.io/projected/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-kube-api-access-v7w6q\") pod \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.786757 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-config-data\") pod \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.786809 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-combined-ca-bundle\") pod \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\" (UID: \"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f\") " Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.797799 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-kube-api-access-v7w6q" (OuterVolumeSpecName: "kube-api-access-v7w6q") pod "ac397035-8dbc-46c6-8e4c-d4c25cb38c8f" (UID: "ac397035-8dbc-46c6-8e4c-d4c25cb38c8f"). InnerVolumeSpecName "kube-api-access-v7w6q". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.798492 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-scripts" (OuterVolumeSpecName: "scripts") pod "ac397035-8dbc-46c6-8e4c-d4c25cb38c8f" (UID: "ac397035-8dbc-46c6-8e4c-d4c25cb38c8f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.831813 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac397035-8dbc-46c6-8e4c-d4c25cb38c8f" (UID: "ac397035-8dbc-46c6-8e4c-d4c25cb38c8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.833905 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-config-data" (OuterVolumeSpecName: "config-data") pod "ac397035-8dbc-46c6-8e4c-d4c25cb38c8f" (UID: "ac397035-8dbc-46c6-8e4c-d4c25cb38c8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.900256 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-config-data\") on node \"crc\" DevicePath \"\""
Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.900287 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.900301 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-scripts\") on node \"crc\" DevicePath \"\""
Dec 11 13:28:44 crc kubenswrapper[4898]: I1211 13:28:44.900309 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7w6q\" (UniqueName: \"kubernetes.io/projected/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f-kube-api-access-v7w6q\") on node \"crc\" DevicePath \"\""
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.441116 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ngkww"]
Dec 11 13:28:45 crc kubenswrapper[4898]: E1211 13:28:45.441688 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac397035-8dbc-46c6-8e4c-d4c25cb38c8f" containerName="nova-cell1-conductor-db-sync"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.441700 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac397035-8dbc-46c6-8e4c-d4c25cb38c8f" containerName="nova-cell1-conductor-db-sync"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.441892 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac397035-8dbc-46c6-8e4c-d4c25cb38c8f" containerName="nova-cell1-conductor-db-sync"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.445293 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngkww"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.447596 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sjh99" event={"ID":"ac397035-8dbc-46c6-8e4c-d4c25cb38c8f","Type":"ContainerDied","Data":"e334c1f355641ba3ddf2f69609c6fe2029eeedfb31187bcfb4ca0ed77c783b30"}
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.447629 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e334c1f355641ba3ddf2f69609c6fe2029eeedfb31187bcfb4ca0ed77c783b30"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.447682 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sjh99"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.453750 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngkww"]
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.619492 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7p7j\" (UniqueName: \"kubernetes.io/projected/09202c46-beab-4e3f-aa19-f693116c3d85-kube-api-access-j7p7j\") pod \"redhat-marketplace-ngkww\" (UID: \"09202c46-beab-4e3f-aa19-f693116c3d85\") " pod="openshift-marketplace/redhat-marketplace-ngkww"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.619570 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09202c46-beab-4e3f-aa19-f693116c3d85-utilities\") pod \"redhat-marketplace-ngkww\" (UID: \"09202c46-beab-4e3f-aa19-f693116c3d85\") " pod="openshift-marketplace/redhat-marketplace-ngkww"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.619599 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09202c46-beab-4e3f-aa19-f693116c3d85-catalog-content\") pod \"redhat-marketplace-ngkww\" (UID: \"09202c46-beab-4e3f-aa19-f693116c3d85\") " pod="openshift-marketplace/redhat-marketplace-ngkww"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.727594 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7p7j\" (UniqueName: \"kubernetes.io/projected/09202c46-beab-4e3f-aa19-f693116c3d85-kube-api-access-j7p7j\") pod \"redhat-marketplace-ngkww\" (UID: \"09202c46-beab-4e3f-aa19-f693116c3d85\") " pod="openshift-marketplace/redhat-marketplace-ngkww"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.727797 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09202c46-beab-4e3f-aa19-f693116c3d85-utilities\") pod \"redhat-marketplace-ngkww\" (UID: \"09202c46-beab-4e3f-aa19-f693116c3d85\") " pod="openshift-marketplace/redhat-marketplace-ngkww"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.727833 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09202c46-beab-4e3f-aa19-f693116c3d85-catalog-content\") pod \"redhat-marketplace-ngkww\" (UID: \"09202c46-beab-4e3f-aa19-f693116c3d85\") " pod="openshift-marketplace/redhat-marketplace-ngkww"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.728403 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09202c46-beab-4e3f-aa19-f693116c3d85-utilities\") pod \"redhat-marketplace-ngkww\" (UID: \"09202c46-beab-4e3f-aa19-f693116c3d85\") " pod="openshift-marketplace/redhat-marketplace-ngkww"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.728492 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09202c46-beab-4e3f-aa19-f693116c3d85-catalog-content\") pod \"redhat-marketplace-ngkww\" (UID: \"09202c46-beab-4e3f-aa19-f693116c3d85\") " pod="openshift-marketplace/redhat-marketplace-ngkww"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.760339 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7p7j\" (UniqueName: \"kubernetes.io/projected/09202c46-beab-4e3f-aa19-f693116c3d85-kube-api-access-j7p7j\") pod \"redhat-marketplace-ngkww\" (UID: \"09202c46-beab-4e3f-aa19-f693116c3d85\") " pod="openshift-marketplace/redhat-marketplace-ngkww"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.772692 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngkww"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.814196 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.816030 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.819148 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.850684 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.879875 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.935194 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5txs8\" (UniqueName: \"kubernetes.io/projected/ec3c0ff6-8367-4309-a686-820483a8f6e5-kube-api-access-5txs8\") pod \"nova-cell1-conductor-0\" (UID: \"ec3c0ff6-8367-4309-a686-820483a8f6e5\") " pod="openstack/nova-cell1-conductor-0"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.980179 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec3c0ff6-8367-4309-a686-820483a8f6e5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ec3c0ff6-8367-4309-a686-820483a8f6e5\") " pod="openstack/nova-cell1-conductor-0"
Dec 11 13:28:45 crc kubenswrapper[4898]: I1211 13:28:45.980289 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec3c0ff6-8367-4309-a686-820483a8f6e5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ec3c0ff6-8367-4309-a686-820483a8f6e5\") " pod="openstack/nova-cell1-conductor-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: E1211 13:28:46.035314 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac397035_8dbc_46c6_8e4c_d4c25cb38c8f.slice\": RecentStats: unable to find data in memory cache]"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.083400 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-combined-ca-bundle\") pod \"0b122352-5f1f-4bf6-905a-e030611792de\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") "
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.083602 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b122352-5f1f-4bf6-905a-e030611792de-logs\") pod \"0b122352-5f1f-4bf6-905a-e030611792de\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") "
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.083665 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmg85\" (UniqueName: \"kubernetes.io/projected/0b122352-5f1f-4bf6-905a-e030611792de-kube-api-access-fmg85\") pod \"0b122352-5f1f-4bf6-905a-e030611792de\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") "
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.083721 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-nova-metadata-tls-certs\") pod \"0b122352-5f1f-4bf6-905a-e030611792de\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") "
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.083803 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-config-data\") pod \"0b122352-5f1f-4bf6-905a-e030611792de\" (UID: \"0b122352-5f1f-4bf6-905a-e030611792de\") "
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.084334 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b122352-5f1f-4bf6-905a-e030611792de-logs" (OuterVolumeSpecName: "logs") pod "0b122352-5f1f-4bf6-905a-e030611792de" (UID: "0b122352-5f1f-4bf6-905a-e030611792de"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.084385 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec3c0ff6-8367-4309-a686-820483a8f6e5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ec3c0ff6-8367-4309-a686-820483a8f6e5\") " pod="openstack/nova-cell1-conductor-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.084439 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec3c0ff6-8367-4309-a686-820483a8f6e5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ec3c0ff6-8367-4309-a686-820483a8f6e5\") " pod="openstack/nova-cell1-conductor-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.084590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5txs8\" (UniqueName: \"kubernetes.io/projected/ec3c0ff6-8367-4309-a686-820483a8f6e5-kube-api-access-5txs8\") pod \"nova-cell1-conductor-0\" (UID: \"ec3c0ff6-8367-4309-a686-820483a8f6e5\") " pod="openstack/nova-cell1-conductor-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.084730 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b122352-5f1f-4bf6-905a-e030611792de-logs\") on node \"crc\" DevicePath \"\""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.090437 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b122352-5f1f-4bf6-905a-e030611792de-kube-api-access-fmg85" (OuterVolumeSpecName: "kube-api-access-fmg85") pod "0b122352-5f1f-4bf6-905a-e030611792de" (UID: "0b122352-5f1f-4bf6-905a-e030611792de"). InnerVolumeSpecName "kube-api-access-fmg85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.109626 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec3c0ff6-8367-4309-a686-820483a8f6e5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ec3c0ff6-8367-4309-a686-820483a8f6e5\") " pod="openstack/nova-cell1-conductor-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.110699 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5txs8\" (UniqueName: \"kubernetes.io/projected/ec3c0ff6-8367-4309-a686-820483a8f6e5-kube-api-access-5txs8\") pod \"nova-cell1-conductor-0\" (UID: \"ec3c0ff6-8367-4309-a686-820483a8f6e5\") " pod="openstack/nova-cell1-conductor-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.111012 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec3c0ff6-8367-4309-a686-820483a8f6e5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ec3c0ff6-8367-4309-a686-820483a8f6e5\") " pod="openstack/nova-cell1-conductor-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.120593 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.155628 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-config-data" (OuterVolumeSpecName: "config-data") pod "0b122352-5f1f-4bf6-905a-e030611792de" (UID: "0b122352-5f1f-4bf6-905a-e030611792de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.185601 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a29489d-803b-4a04-a612-485c4a015ccf-combined-ca-bundle\") pod \"8a29489d-803b-4a04-a612-485c4a015ccf\" (UID: \"8a29489d-803b-4a04-a612-485c4a015ccf\") "
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.186612 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4vdw\" (UniqueName: \"kubernetes.io/projected/8a29489d-803b-4a04-a612-485c4a015ccf-kube-api-access-j4vdw\") pod \"8a29489d-803b-4a04-a612-485c4a015ccf\" (UID: \"8a29489d-803b-4a04-a612-485c4a015ccf\") "
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.186838 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a29489d-803b-4a04-a612-485c4a015ccf-config-data\") pod \"8a29489d-803b-4a04-a612-485c4a015ccf\" (UID: \"8a29489d-803b-4a04-a612-485c4a015ccf\") "
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.187746 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmg85\" (UniqueName: \"kubernetes.io/projected/0b122352-5f1f-4bf6-905a-e030611792de-kube-api-access-fmg85\") on node \"crc\" DevicePath \"\""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.187761 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-config-data\") on node \"crc\" DevicePath \"\""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.193758 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b122352-5f1f-4bf6-905a-e030611792de" (UID: "0b122352-5f1f-4bf6-905a-e030611792de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.195672 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a29489d-803b-4a04-a612-485c4a015ccf-kube-api-access-j4vdw" (OuterVolumeSpecName: "kube-api-access-j4vdw") pod "8a29489d-803b-4a04-a612-485c4a015ccf" (UID: "8a29489d-803b-4a04-a612-485c4a015ccf"). InnerVolumeSpecName "kube-api-access-j4vdw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.235944 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a29489d-803b-4a04-a612-485c4a015ccf-config-data" (OuterVolumeSpecName: "config-data") pod "8a29489d-803b-4a04-a612-485c4a015ccf" (UID: "8a29489d-803b-4a04-a612-485c4a015ccf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.236710 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0b122352-5f1f-4bf6-905a-e030611792de" (UID: "0b122352-5f1f-4bf6-905a-e030611792de"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.242353 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a29489d-803b-4a04-a612-485c4a015ccf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a29489d-803b-4a04-a612-485c4a015ccf" (UID: "8a29489d-803b-4a04-a612-485c4a015ccf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.278581 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7g6rk"]
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.290050 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.290091 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4vdw\" (UniqueName: \"kubernetes.io/projected/8a29489d-803b-4a04-a612-485c4a015ccf-kube-api-access-j4vdw\") on node \"crc\" DevicePath \"\""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.290105 4898 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b122352-5f1f-4bf6-905a-e030611792de-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.290118 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a29489d-803b-4a04-a612-485c4a015ccf-config-data\") on node \"crc\" DevicePath \"\""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.290134 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a29489d-803b-4a04-a612-485c4a015ccf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.360034 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.473200 4898 generic.go:334] "Generic (PLEG): container finished" podID="be99ef39-7e89-4189-8f85-91cc80505896" containerID="f5b03c43b8097f88807399d5ebb212023d00d6c2007613ef8b15039bd91d0b96" exitCode=0
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.473444 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"be99ef39-7e89-4189-8f85-91cc80505896","Type":"ContainerDied","Data":"f5b03c43b8097f88807399d5ebb212023d00d6c2007613ef8b15039bd91d0b96"}
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.474743 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-f9lpn" event={"ID":"0b381d75-882f-425b-9c58-ec00804fda34","Type":"ContainerStarted","Data":"c9a3f122a98d0205d6df97b02ea6119c381812cf008acf7587427145634ff0e4"}
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.478054 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7g6rk" event={"ID":"78074bfc-7a8f-4154-99f1-6c1abefca223","Type":"ContainerStarted","Data":"9a6634885421b535622a9a24d8307b1e59728cd80bfdb1ea5216b34583b19a55"}
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.481605 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8a29489d-803b-4a04-a612-485c4a015ccf","Type":"ContainerDied","Data":"d4fc65b12d369a8260658d38a956f3bb70764607dcdf697e3b37085b2592ea53"}
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.481665 4898 scope.go:117] "RemoveContainer" containerID="f9732cc402e347d4b36742d400e8ab9e5305266fb1c433be67d4bdf39775bc4b"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.481812 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.485689 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b122352-5f1f-4bf6-905a-e030611792de","Type":"ContainerDied","Data":"7b39637beaaf84a4332a3d32006736cf8c161272fa5fb1d4e19016f2337171a8"}
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.485862 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.500259 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-f9lpn" podStartSLOduration=2.477694689 podStartE2EDuration="9.500241058s" podCreationTimestamp="2025-12-11 13:28:37 +0000 UTC" firstStartedPulling="2025-12-11 13:28:38.689612641 +0000 UTC m=+1476.261939068" lastFinishedPulling="2025-12-11 13:28:45.712159 +0000 UTC m=+1483.284485437" observedRunningTime="2025-12-11 13:28:46.49623643 +0000 UTC m=+1484.068562887" watchObservedRunningTime="2025-12-11 13:28:46.500241058 +0000 UTC m=+1484.072567495"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.540665 4898 scope.go:117] "RemoveContainer" containerID="6b8347c8204905e123f632717b1f2f776b13c415eacde7bed18fa2e25df5ca5a"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.567011 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.610562 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.622412 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 11 13:28:46 crc kubenswrapper[4898]: E1211 13:28:46.623049 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b122352-5f1f-4bf6-905a-e030611792de" containerName="nova-metadata-log"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.623061 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b122352-5f1f-4bf6-905a-e030611792de" containerName="nova-metadata-log"
Dec 11 13:28:46 crc kubenswrapper[4898]: E1211 13:28:46.623100 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b122352-5f1f-4bf6-905a-e030611792de" containerName="nova-metadata-metadata"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.623106 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b122352-5f1f-4bf6-905a-e030611792de" containerName="nova-metadata-metadata"
Dec 11 13:28:46 crc kubenswrapper[4898]: E1211 13:28:46.623125 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a29489d-803b-4a04-a612-485c4a015ccf" containerName="nova-scheduler-scheduler"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.623130 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a29489d-803b-4a04-a612-485c4a015ccf" containerName="nova-scheduler-scheduler"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.623345 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a29489d-803b-4a04-a612-485c4a015ccf" containerName="nova-scheduler-scheduler"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.623363 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b122352-5f1f-4bf6-905a-e030611792de" containerName="nova-metadata-log"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.623384 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b122352-5f1f-4bf6-905a-e030611792de" containerName="nova-metadata-metadata"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.624102 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.627688 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.638521 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.654555 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.676518 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.693565 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngkww"]
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.694583 4898 scope.go:117] "RemoveContainer" containerID="91817998ae207d7c069cae2bb71f93ee500500383d401e2d9488753f47323842"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.704836 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7vfl\" (UniqueName: \"kubernetes.io/projected/4337e072-ac49-4912-946d-e52de7c33a1b-kube-api-access-k7vfl\") pod \"nova-scheduler-0\" (UID: \"4337e072-ac49-4912-946d-e52de7c33a1b\") " pod="openstack/nova-scheduler-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.704932 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4337e072-ac49-4912-946d-e52de7c33a1b-config-data\") pod \"nova-scheduler-0\" (UID: \"4337e072-ac49-4912-946d-e52de7c33a1b\") " pod="openstack/nova-scheduler-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.704963 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4337e072-ac49-4912-946d-e52de7c33a1b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4337e072-ac49-4912-946d-e52de7c33a1b\") " pod="openstack/nova-scheduler-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.705097 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.709555 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.712021 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.714086 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.753323 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.804386 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b122352-5f1f-4bf6-905a-e030611792de" path="/var/lib/kubelet/pods/0b122352-5f1f-4bf6-905a-e030611792de/volumes"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.805610 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a29489d-803b-4a04-a612-485c4a015ccf" path="/var/lib/kubelet/pods/8a29489d-803b-4a04-a612-485c4a015ccf/volumes"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.807573 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4337e072-ac49-4912-946d-e52de7c33a1b-config-data\") pod \"nova-scheduler-0\" (UID: \"4337e072-ac49-4912-946d-e52de7c33a1b\") " pod="openstack/nova-scheduler-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.807608 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4337e072-ac49-4912-946d-e52de7c33a1b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4337e072-ac49-4912-946d-e52de7c33a1b\") " pod="openstack/nova-scheduler-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.807683 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9d50b6b-d1c5-4072-964e-475b4bfb685d-logs\") pod \"nova-metadata-0\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.807703 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzwhc\" (UniqueName: \"kubernetes.io/projected/c9d50b6b-d1c5-4072-964e-475b4bfb685d-kube-api-access-wzwhc\") pod \"nova-metadata-0\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.807729 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.807779 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.807795 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-config-data\") pod \"nova-metadata-0\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.808122 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7vfl\" (UniqueName: \"kubernetes.io/projected/4337e072-ac49-4912-946d-e52de7c33a1b-kube-api-access-k7vfl\") pod \"nova-scheduler-0\" (UID: \"4337e072-ac49-4912-946d-e52de7c33a1b\") " pod="openstack/nova-scheduler-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.814966 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4337e072-ac49-4912-946d-e52de7c33a1b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4337e072-ac49-4912-946d-e52de7c33a1b\") " pod="openstack/nova-scheduler-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.816269 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4337e072-ac49-4912-946d-e52de7c33a1b-config-data\") pod \"nova-scheduler-0\" (UID: \"4337e072-ac49-4912-946d-e52de7c33a1b\") " pod="openstack/nova-scheduler-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.824530 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7vfl\" (UniqueName: \"kubernetes.io/projected/4337e072-ac49-4912-946d-e52de7c33a1b-kube-api-access-k7vfl\") pod \"nova-scheduler-0\" (UID: \"4337e072-ac49-4912-946d-e52de7c33a1b\") " pod="openstack/nova-scheduler-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.910596 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.910636 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-config-data\") pod \"nova-metadata-0\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.910857 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9d50b6b-d1c5-4072-964e-475b4bfb685d-logs\") pod \"nova-metadata-0\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.910878 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzwhc\" (UniqueName: \"kubernetes.io/projected/c9d50b6b-d1c5-4072-964e-475b4bfb685d-kube-api-access-wzwhc\") pod \"nova-metadata-0\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.910904 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.912725 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9d50b6b-d1c5-4072-964e-475b4bfb685d-logs\") pod \"nova-metadata-0\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.919959 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-config-data\") pod \"nova-metadata-0\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.926778 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.927479 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.929714 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzwhc\" (UniqueName: \"kubernetes.io/projected/c9d50b6b-d1c5-4072-964e-475b4bfb685d-kube-api-access-wzwhc\") pod \"nova-metadata-0\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " pod="openstack/nova-metadata-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.992355 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 11 13:28:46 crc kubenswrapper[4898]: I1211 13:28:46.997733 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.030172 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.066349 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 11 13:28:47 crc kubenswrapper[4898]: W1211 13:28:47.083361 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec3c0ff6_8367_4309_a686_820483a8f6e5.slice/crio-3f0b86b017999af1b3d744a6c01cc87cb68be7eb3493ae95f5f2478ab353ec64 WatchSource:0}: Error finding container 3f0b86b017999af1b3d744a6c01cc87cb68be7eb3493ae95f5f2478ab353ec64: Status 404 returned error can't find the container with id 3f0b86b017999af1b3d744a6c01cc87cb68be7eb3493ae95f5f2478ab353ec64 Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.117774 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be99ef39-7e89-4189-8f85-91cc80505896-config-data\") pod \"be99ef39-7e89-4189-8f85-91cc80505896\" (UID: \"be99ef39-7e89-4189-8f85-91cc80505896\") " Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.117875 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be99ef39-7e89-4189-8f85-91cc80505896-combined-ca-bundle\") pod \"be99ef39-7e89-4189-8f85-91cc80505896\" (UID: \"be99ef39-7e89-4189-8f85-91cc80505896\") " Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.119547 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be99ef39-7e89-4189-8f85-91cc80505896-logs\") pod \"be99ef39-7e89-4189-8f85-91cc80505896\" (UID: \"be99ef39-7e89-4189-8f85-91cc80505896\") " Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.119591 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4qpx\" (UniqueName: 
\"kubernetes.io/projected/be99ef39-7e89-4189-8f85-91cc80505896-kube-api-access-s4qpx\") pod \"be99ef39-7e89-4189-8f85-91cc80505896\" (UID: \"be99ef39-7e89-4189-8f85-91cc80505896\") " Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.120578 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be99ef39-7e89-4189-8f85-91cc80505896-logs" (OuterVolumeSpecName: "logs") pod "be99ef39-7e89-4189-8f85-91cc80505896" (UID: "be99ef39-7e89-4189-8f85-91cc80505896"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.122090 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be99ef39-7e89-4189-8f85-91cc80505896-logs\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.126379 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be99ef39-7e89-4189-8f85-91cc80505896-kube-api-access-s4qpx" (OuterVolumeSpecName: "kube-api-access-s4qpx") pod "be99ef39-7e89-4189-8f85-91cc80505896" (UID: "be99ef39-7e89-4189-8f85-91cc80505896"). InnerVolumeSpecName "kube-api-access-s4qpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.154970 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be99ef39-7e89-4189-8f85-91cc80505896-config-data" (OuterVolumeSpecName: "config-data") pod "be99ef39-7e89-4189-8f85-91cc80505896" (UID: "be99ef39-7e89-4189-8f85-91cc80505896"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.169226 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be99ef39-7e89-4189-8f85-91cc80505896-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be99ef39-7e89-4189-8f85-91cc80505896" (UID: "be99ef39-7e89-4189-8f85-91cc80505896"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.223745 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be99ef39-7e89-4189-8f85-91cc80505896-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.224012 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be99ef39-7e89-4189-8f85-91cc80505896-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.224025 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4qpx\" (UniqueName: \"kubernetes.io/projected/be99ef39-7e89-4189-8f85-91cc80505896-kube-api-access-s4qpx\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.498987 4898 generic.go:334] "Generic (PLEG): container finished" podID="09202c46-beab-4e3f-aa19-f693116c3d85" containerID="d8f3fb958911a3fe38909e22367776c0d0d5a705a98dff53f9b0bd0e84d32b3d" exitCode=0 Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.499106 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngkww" event={"ID":"09202c46-beab-4e3f-aa19-f693116c3d85","Type":"ContainerDied","Data":"d8f3fb958911a3fe38909e22367776c0d0d5a705a98dff53f9b0bd0e84d32b3d"} Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.499144 4898 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-ngkww" event={"ID":"09202c46-beab-4e3f-aa19-f693116c3d85","Type":"ContainerStarted","Data":"fd46eb9eb485cc40a2b712d89de59deb61e811bbe5ac9d7475570cfc5903e01b"} Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.503718 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"be99ef39-7e89-4189-8f85-91cc80505896","Type":"ContainerDied","Data":"76d373a189cf13dc25d257ac6796be2e989e06d66a8ed6a0e4426f1f6e638f92"} Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.503774 4898 scope.go:117] "RemoveContainer" containerID="f5b03c43b8097f88807399d5ebb212023d00d6c2007613ef8b15039bd91d0b96" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.503908 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.514542 4898 generic.go:334] "Generic (PLEG): container finished" podID="78074bfc-7a8f-4154-99f1-6c1abefca223" containerID="66042f98a624765e3a0cc63862363ac26489d7658ed44c5c5ddf20df39f8300e" exitCode=0 Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.514654 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7g6rk" event={"ID":"78074bfc-7a8f-4154-99f1-6c1abefca223","Type":"ContainerDied","Data":"66042f98a624765e3a0cc63862363ac26489d7658ed44c5c5ddf20df39f8300e"} Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.532282 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 13:28:47 crc kubenswrapper[4898]: W1211 13:28:47.546877 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4337e072_ac49_4912_946d_e52de7c33a1b.slice/crio-7d4092cc18d7a71bfad409a098b09b27912d77aadb068847a880021b0a483d96 WatchSource:0}: Error finding container 
7d4092cc18d7a71bfad409a098b09b27912d77aadb068847a880021b0a483d96: Status 404 returned error can't find the container with id 7d4092cc18d7a71bfad409a098b09b27912d77aadb068847a880021b0a483d96 Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.554146 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ec3c0ff6-8367-4309-a686-820483a8f6e5","Type":"ContainerStarted","Data":"6ee992fc67c6f2c1919bd79a6b4bbeb00778ccd35a9912924d740ce8d00add35"} Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.554211 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ec3c0ff6-8367-4309-a686-820483a8f6e5","Type":"ContainerStarted","Data":"3f0b86b017999af1b3d744a6c01cc87cb68be7eb3493ae95f5f2478ab353ec64"} Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.615594 4898 scope.go:117] "RemoveContainer" containerID="e8513b56ae064c5a9aabd60c9682e79c6235cb547769908b23caacacfd0f694f" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.628343 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.682279 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.684633 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.68461353 podStartE2EDuration="2.68461353s" podCreationTimestamp="2025-12-11 13:28:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:28:47.580333598 +0000 UTC m=+1485.152660035" watchObservedRunningTime="2025-12-11 13:28:47.68461353 +0000 UTC m=+1485.256939967" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.728873 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 11 13:28:47 
crc kubenswrapper[4898]: E1211 13:28:47.729318 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be99ef39-7e89-4189-8f85-91cc80505896" containerName="nova-api-api" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.729335 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="be99ef39-7e89-4189-8f85-91cc80505896" containerName="nova-api-api" Dec 11 13:28:47 crc kubenswrapper[4898]: E1211 13:28:47.729382 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be99ef39-7e89-4189-8f85-91cc80505896" containerName="nova-api-log" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.729388 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="be99ef39-7e89-4189-8f85-91cc80505896" containerName="nova-api-log" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.729611 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="be99ef39-7e89-4189-8f85-91cc80505896" containerName="nova-api-log" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.729634 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="be99ef39-7e89-4189-8f85-91cc80505896" containerName="nova-api-api" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.731252 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.735568 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.753193 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.765893 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.872047 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8lm5\" (UniqueName: \"kubernetes.io/projected/db5a5721-a011-492c-b79d-cc175721a3c4-kube-api-access-k8lm5\") pod \"nova-api-0\" (UID: \"db5a5721-a011-492c-b79d-cc175721a3c4\") " pod="openstack/nova-api-0" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.872117 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5a5721-a011-492c-b79d-cc175721a3c4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db5a5721-a011-492c-b79d-cc175721a3c4\") " pod="openstack/nova-api-0" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.872188 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db5a5721-a011-492c-b79d-cc175721a3c4-config-data\") pod \"nova-api-0\" (UID: \"db5a5721-a011-492c-b79d-cc175721a3c4\") " pod="openstack/nova-api-0" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.872376 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db5a5721-a011-492c-b79d-cc175721a3c4-logs\") pod \"nova-api-0\" (UID: \"db5a5721-a011-492c-b79d-cc175721a3c4\") " pod="openstack/nova-api-0" Dec 11 
13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.975379 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db5a5721-a011-492c-b79d-cc175721a3c4-logs\") pod \"nova-api-0\" (UID: \"db5a5721-a011-492c-b79d-cc175721a3c4\") " pod="openstack/nova-api-0" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.975516 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8lm5\" (UniqueName: \"kubernetes.io/projected/db5a5721-a011-492c-b79d-cc175721a3c4-kube-api-access-k8lm5\") pod \"nova-api-0\" (UID: \"db5a5721-a011-492c-b79d-cc175721a3c4\") " pod="openstack/nova-api-0" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.975559 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5a5721-a011-492c-b79d-cc175721a3c4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db5a5721-a011-492c-b79d-cc175721a3c4\") " pod="openstack/nova-api-0" Dec 11 13:28:47 crc kubenswrapper[4898]: I1211 13:28:47.975608 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db5a5721-a011-492c-b79d-cc175721a3c4-config-data\") pod \"nova-api-0\" (UID: \"db5a5721-a011-492c-b79d-cc175721a3c4\") " pod="openstack/nova-api-0" Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:47.978636 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db5a5721-a011-492c-b79d-cc175721a3c4-logs\") pod \"nova-api-0\" (UID: \"db5a5721-a011-492c-b79d-cc175721a3c4\") " pod="openstack/nova-api-0" Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:47.994780 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db5a5721-a011-492c-b79d-cc175721a3c4-config-data\") pod \"nova-api-0\" (UID: 
\"db5a5721-a011-492c-b79d-cc175721a3c4\") " pod="openstack/nova-api-0" Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:47.996071 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8lm5\" (UniqueName: \"kubernetes.io/projected/db5a5721-a011-492c-b79d-cc175721a3c4-kube-api-access-k8lm5\") pod \"nova-api-0\" (UID: \"db5a5721-a011-492c-b79d-cc175721a3c4\") " pod="openstack/nova-api-0" Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:47.998037 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5a5721-a011-492c-b79d-cc175721a3c4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db5a5721-a011-492c-b79d-cc175721a3c4\") " pod="openstack/nova-api-0" Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:48.262330 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:48.589049 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4337e072-ac49-4912-946d-e52de7c33a1b","Type":"ContainerStarted","Data":"a1dbf6c292cd096709c9173ab16e98d34d6b4899b2eae82c2ee00bd664f557d8"} Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:48.589108 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4337e072-ac49-4912-946d-e52de7c33a1b","Type":"ContainerStarted","Data":"7d4092cc18d7a71bfad409a098b09b27912d77aadb068847a880021b0a483d96"} Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:48.594711 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7g6rk" event={"ID":"78074bfc-7a8f-4154-99f1-6c1abefca223","Type":"ContainerStarted","Data":"fc93ad9481faaf06cb9cfa1e3eac5eb1eb43d20dd7aaf5112cc1512642115e96"} Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:48.600910 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"c9d50b6b-d1c5-4072-964e-475b4bfb685d","Type":"ContainerStarted","Data":"cefc528e45f4b84ad123b7112e1b8cdbc6fffed72b0905ba66016708c0b55e09"} Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:48.600960 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9d50b6b-d1c5-4072-964e-475b4bfb685d","Type":"ContainerStarted","Data":"2f061f8cda1338f451f3df1a1607c6f6db4157dc25c409a20ffdd023497e8b3b"} Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:48.600975 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9d50b6b-d1c5-4072-964e-475b4bfb685d","Type":"ContainerStarted","Data":"4157068ed5e8522edbbd23b7fe110bdef403b8e56bc491ebd1eb422c9a9c7a4a"} Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:48.601293 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:48.616579 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6165622710000003 podStartE2EDuration="2.616562271s" podCreationTimestamp="2025-12-11 13:28:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:28:48.61465227 +0000 UTC m=+1486.186978707" watchObservedRunningTime="2025-12-11 13:28:48.616562271 +0000 UTC m=+1486.188888708" Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:48.672884 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.672862625 podStartE2EDuration="2.672862625s" podCreationTimestamp="2025-12-11 13:28:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:28:48.663355268 +0000 UTC m=+1486.235681705" 
watchObservedRunningTime="2025-12-11 13:28:48.672862625 +0000 UTC m=+1486.245189062" Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:48.792595 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be99ef39-7e89-4189-8f85-91cc80505896" path="/var/lib/kubelet/pods/be99ef39-7e89-4189-8f85-91cc80505896/volumes" Dec 11 13:28:48 crc kubenswrapper[4898]: I1211 13:28:48.842946 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:28:48 crc kubenswrapper[4898]: W1211 13:28:48.849204 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb5a5721_a011_492c_b79d_cc175721a3c4.slice/crio-efd9c241fd0db5a98a438a5a7410d7cf66ced0c79dccf04526786d8c5b8e7d18 WatchSource:0}: Error finding container efd9c241fd0db5a98a438a5a7410d7cf66ced0c79dccf04526786d8c5b8e7d18: Status 404 returned error can't find the container with id efd9c241fd0db5a98a438a5a7410d7cf66ced0c79dccf04526786d8c5b8e7d18 Dec 11 13:28:49 crc kubenswrapper[4898]: I1211 13:28:49.612915 4898 generic.go:334] "Generic (PLEG): container finished" podID="78074bfc-7a8f-4154-99f1-6c1abefca223" containerID="fc93ad9481faaf06cb9cfa1e3eac5eb1eb43d20dd7aaf5112cc1512642115e96" exitCode=0 Dec 11 13:28:49 crc kubenswrapper[4898]: I1211 13:28:49.613048 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7g6rk" event={"ID":"78074bfc-7a8f-4154-99f1-6c1abefca223","Type":"ContainerDied","Data":"fc93ad9481faaf06cb9cfa1e3eac5eb1eb43d20dd7aaf5112cc1512642115e96"} Dec 11 13:28:49 crc kubenswrapper[4898]: I1211 13:28:49.616261 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db5a5721-a011-492c-b79d-cc175721a3c4","Type":"ContainerStarted","Data":"183e17e589fd2c89941b4b6ee9a299c6676c61402b03bbb7f580231da70d85df"} Dec 11 13:28:49 crc kubenswrapper[4898]: I1211 13:28:49.616303 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"db5a5721-a011-492c-b79d-cc175721a3c4","Type":"ContainerStarted","Data":"4710cb8ea8f1d8dc82a27c9e1822b82d18c87c680794c9cf0033f6878139aeeb"} Dec 11 13:28:49 crc kubenswrapper[4898]: I1211 13:28:49.616320 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db5a5721-a011-492c-b79d-cc175721a3c4","Type":"ContainerStarted","Data":"efd9c241fd0db5a98a438a5a7410d7cf66ced0c79dccf04526786d8c5b8e7d18"} Dec 11 13:28:49 crc kubenswrapper[4898]: I1211 13:28:49.628045 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngkww" event={"ID":"09202c46-beab-4e3f-aa19-f693116c3d85","Type":"ContainerStarted","Data":"18c889b834d38549edd32686508be946d9c9b534d6bf7b681ca4e20b36efb634"} Dec 11 13:28:49 crc kubenswrapper[4898]: I1211 13:28:49.631512 4898 generic.go:334] "Generic (PLEG): container finished" podID="0b381d75-882f-425b-9c58-ec00804fda34" containerID="c9a3f122a98d0205d6df97b02ea6119c381812cf008acf7587427145634ff0e4" exitCode=0 Dec 11 13:28:49 crc kubenswrapper[4898]: I1211 13:28:49.631562 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-f9lpn" event={"ID":"0b381d75-882f-425b-9c58-ec00804fda34","Type":"ContainerDied","Data":"c9a3f122a98d0205d6df97b02ea6119c381812cf008acf7587427145634ff0e4"} Dec 11 13:28:49 crc kubenswrapper[4898]: I1211 13:28:49.664837 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.66481771 podStartE2EDuration="2.66481771s" podCreationTimestamp="2025-12-11 13:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:28:49.66036948 +0000 UTC m=+1487.232695917" watchObservedRunningTime="2025-12-11 13:28:49.66481771 +0000 UTC m=+1487.237144147" Dec 11 13:28:50 crc kubenswrapper[4898]: I1211 13:28:50.644313 4898 
generic.go:334] "Generic (PLEG): container finished" podID="09202c46-beab-4e3f-aa19-f693116c3d85" containerID="18c889b834d38549edd32686508be946d9c9b534d6bf7b681ca4e20b36efb634" exitCode=0 Dec 11 13:28:50 crc kubenswrapper[4898]: I1211 13:28:50.644361 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngkww" event={"ID":"09202c46-beab-4e3f-aa19-f693116c3d85","Type":"ContainerDied","Data":"18c889b834d38549edd32686508be946d9c9b534d6bf7b681ca4e20b36efb634"} Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.100502 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.195013 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-combined-ca-bundle\") pod \"0b381d75-882f-425b-9c58-ec00804fda34\" (UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.195770 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-config-data\") pod \"0b381d75-882f-425b-9c58-ec00804fda34\" (UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.195889 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-scripts\") pod \"0b381d75-882f-425b-9c58-ec00804fda34\" (UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.196227 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hl6r\" (UniqueName: 
\"kubernetes.io/projected/0b381d75-882f-425b-9c58-ec00804fda34-kube-api-access-2hl6r\") pod \"0b381d75-882f-425b-9c58-ec00804fda34\" (UID: \"0b381d75-882f-425b-9c58-ec00804fda34\") " Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.202413 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b381d75-882f-425b-9c58-ec00804fda34-kube-api-access-2hl6r" (OuterVolumeSpecName: "kube-api-access-2hl6r") pod "0b381d75-882f-425b-9c58-ec00804fda34" (UID: "0b381d75-882f-425b-9c58-ec00804fda34"). InnerVolumeSpecName "kube-api-access-2hl6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.205002 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-scripts" (OuterVolumeSpecName: "scripts") pod "0b381d75-882f-425b-9c58-ec00804fda34" (UID: "0b381d75-882f-425b-9c58-ec00804fda34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.238313 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b381d75-882f-425b-9c58-ec00804fda34" (UID: "0b381d75-882f-425b-9c58-ec00804fda34"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.239626 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-config-data" (OuterVolumeSpecName: "config-data") pod "0b381d75-882f-425b-9c58-ec00804fda34" (UID: "0b381d75-882f-425b-9c58-ec00804fda34"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.300597 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hl6r\" (UniqueName: \"kubernetes.io/projected/0b381d75-882f-425b-9c58-ec00804fda34-kube-api-access-2hl6r\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.301238 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.301333 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.301414 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b381d75-882f-425b-9c58-ec00804fda34-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.656365 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-f9lpn" event={"ID":"0b381d75-882f-425b-9c58-ec00804fda34","Type":"ContainerDied","Data":"5f3eed33c0885d03279ce2a95ebb6547f783796c486de926cef7ab6f4edd873d"} Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.656682 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f3eed33c0885d03279ce2a95ebb6547f783796c486de926cef7ab6f4edd873d" Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.656413 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-f9lpn" Dec 11 13:28:51 crc kubenswrapper[4898]: I1211 13:28:51.993722 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.031026 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.031085 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.646510 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wjmfk"] Dec 11 13:28:52 crc kubenswrapper[4898]: E1211 13:28:52.647421 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b381d75-882f-425b-9c58-ec00804fda34" containerName="aodh-db-sync" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.647450 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b381d75-882f-425b-9c58-ec00804fda34" containerName="aodh-db-sync" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.647756 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b381d75-882f-425b-9c58-ec00804fda34" containerName="aodh-db-sync" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.649600 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.658705 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wjmfk"] Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.700189 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7g6rk" event={"ID":"78074bfc-7a8f-4154-99f1-6c1abefca223","Type":"ContainerStarted","Data":"0293b7791f85ddab3b413f003364e5031078f21cb17e1c37d07a7035959e8955"} Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.704806 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngkww" event={"ID":"09202c46-beab-4e3f-aa19-f693116c3d85","Type":"ContainerStarted","Data":"3ff27c4fa2fa8cc4c189febb36e4ce98cb7e281ff565f7f3a320e826fb891a84"} Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.751534 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7g6rk" podStartSLOduration=5.2631606269999995 podStartE2EDuration="9.751516875s" podCreationTimestamp="2025-12-11 13:28:43 +0000 UTC" firstStartedPulling="2025-12-11 13:28:47.531797764 +0000 UTC m=+1485.104124201" lastFinishedPulling="2025-12-11 13:28:52.020154012 +0000 UTC m=+1489.592480449" observedRunningTime="2025-12-11 13:28:52.724988637 +0000 UTC m=+1490.297315084" watchObservedRunningTime="2025-12-11 13:28:52.751516875 +0000 UTC m=+1490.323843302" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.826002 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ngkww" podStartSLOduration=3.305839022 podStartE2EDuration="7.82598056s" podCreationTimestamp="2025-12-11 13:28:45 +0000 UTC" firstStartedPulling="2025-12-11 13:28:47.50134636 +0000 UTC m=+1485.073672807" lastFinishedPulling="2025-12-11 13:28:52.021487898 +0000 UTC 
m=+1489.593814345" observedRunningTime="2025-12-11 13:28:52.750933989 +0000 UTC m=+1490.323260446" watchObservedRunningTime="2025-12-11 13:28:52.82598056 +0000 UTC m=+1490.398306997" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.834806 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.839488 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.839611 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.842833 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.843039 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-h8xn6" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.852067 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.876113 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec835c2-4160-4de3-ac12-7e808d0882b3-catalog-content\") pod \"redhat-operators-wjmfk\" (UID: \"cec835c2-4160-4de3-ac12-7e808d0882b3\") " pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.876415 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec835c2-4160-4de3-ac12-7e808d0882b3-utilities\") pod \"redhat-operators-wjmfk\" (UID: \"cec835c2-4160-4de3-ac12-7e808d0882b3\") " pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.876570 
4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g86t9\" (UniqueName: \"kubernetes.io/projected/cec835c2-4160-4de3-ac12-7e808d0882b3-kube-api-access-g86t9\") pod \"redhat-operators-wjmfk\" (UID: \"cec835c2-4160-4de3-ac12-7e808d0882b3\") " pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.978449 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-scripts\") pod \"aodh-0\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " pod="openstack/aodh-0" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.978514 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec835c2-4160-4de3-ac12-7e808d0882b3-catalog-content\") pod \"redhat-operators-wjmfk\" (UID: \"cec835c2-4160-4de3-ac12-7e808d0882b3\") " pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.978544 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec835c2-4160-4de3-ac12-7e808d0882b3-utilities\") pod \"redhat-operators-wjmfk\" (UID: \"cec835c2-4160-4de3-ac12-7e808d0882b3\") " pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.978623 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " pod="openstack/aodh-0" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.978667 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-config-data\") pod \"aodh-0\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " pod="openstack/aodh-0" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.978696 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g86t9\" (UniqueName: \"kubernetes.io/projected/cec835c2-4160-4de3-ac12-7e808d0882b3-kube-api-access-g86t9\") pod \"redhat-operators-wjmfk\" (UID: \"cec835c2-4160-4de3-ac12-7e808d0882b3\") " pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.978926 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbn6c\" (UniqueName: \"kubernetes.io/projected/b73e56da-3159-42d6-a1f8-d72c25c82451-kube-api-access-wbn6c\") pod \"aodh-0\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " pod="openstack/aodh-0" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.979066 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec835c2-4160-4de3-ac12-7e808d0882b3-utilities\") pod \"redhat-operators-wjmfk\" (UID: \"cec835c2-4160-4de3-ac12-7e808d0882b3\") " pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:28:52 crc kubenswrapper[4898]: I1211 13:28:52.979196 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec835c2-4160-4de3-ac12-7e808d0882b3-catalog-content\") pod \"redhat-operators-wjmfk\" (UID: \"cec835c2-4160-4de3-ac12-7e808d0882b3\") " pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:28:53 crc kubenswrapper[4898]: I1211 13:28:53.006674 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g86t9\" (UniqueName: 
\"kubernetes.io/projected/cec835c2-4160-4de3-ac12-7e808d0882b3-kube-api-access-g86t9\") pod \"redhat-operators-wjmfk\" (UID: \"cec835c2-4160-4de3-ac12-7e808d0882b3\") " pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:28:53 crc kubenswrapper[4898]: I1211 13:28:53.095132 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-scripts\") pod \"aodh-0\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " pod="openstack/aodh-0" Dec 11 13:28:53 crc kubenswrapper[4898]: I1211 13:28:53.095218 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " pod="openstack/aodh-0" Dec 11 13:28:53 crc kubenswrapper[4898]: I1211 13:28:53.095260 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-config-data\") pod \"aodh-0\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " pod="openstack/aodh-0" Dec 11 13:28:53 crc kubenswrapper[4898]: I1211 13:28:53.095327 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbn6c\" (UniqueName: \"kubernetes.io/projected/b73e56da-3159-42d6-a1f8-d72c25c82451-kube-api-access-wbn6c\") pod \"aodh-0\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " pod="openstack/aodh-0" Dec 11 13:28:53 crc kubenswrapper[4898]: I1211 13:28:53.102640 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-scripts\") pod \"aodh-0\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " pod="openstack/aodh-0" Dec 11 13:28:53 crc kubenswrapper[4898]: I1211 13:28:53.102640 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " pod="openstack/aodh-0" Dec 11 13:28:53 crc kubenswrapper[4898]: I1211 13:28:53.104368 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-config-data\") pod \"aodh-0\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " pod="openstack/aodh-0" Dec 11 13:28:53 crc kubenswrapper[4898]: I1211 13:28:53.117055 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbn6c\" (UniqueName: \"kubernetes.io/projected/b73e56da-3159-42d6-a1f8-d72c25c82451-kube-api-access-wbn6c\") pod \"aodh-0\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " pod="openstack/aodh-0" Dec 11 13:28:53 crc kubenswrapper[4898]: I1211 13:28:53.178790 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 11 13:28:53 crc kubenswrapper[4898]: I1211 13:28:53.272442 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:28:53 crc kubenswrapper[4898]: I1211 13:28:53.404801 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:28:53 crc kubenswrapper[4898]: I1211 13:28:53.406774 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:28:53 crc kubenswrapper[4898]: I1211 13:28:53.775337 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 11 13:28:53 crc kubenswrapper[4898]: I1211 13:28:53.948441 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wjmfk"] Dec 11 13:28:53 crc kubenswrapper[4898]: W1211 13:28:53.955465 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcec835c2_4160_4de3_ac12_7e808d0882b3.slice/crio-fdbf33e56068f5e21b63a35214eb3d35a7049edd994047a8996fbd7549f8c1f9 WatchSource:0}: Error finding container fdbf33e56068f5e21b63a35214eb3d35a7049edd994047a8996fbd7549f8c1f9: Status 404 returned error can't find the container with id fdbf33e56068f5e21b63a35214eb3d35a7049edd994047a8996fbd7549f8c1f9 Dec 11 13:28:54 crc kubenswrapper[4898]: I1211 13:28:54.471341 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7g6rk" podUID="78074bfc-7a8f-4154-99f1-6c1abefca223" containerName="registry-server" probeResult="failure" output=< Dec 11 13:28:54 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 13:28:54 crc kubenswrapper[4898]: > Dec 11 13:28:54 crc kubenswrapper[4898]: I1211 13:28:54.756697 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"b73e56da-3159-42d6-a1f8-d72c25c82451","Type":"ContainerStarted","Data":"163e91db32d2702038e09839f575e2ac673c39e9c44579879d49c23e0f2fa361"} Dec 11 13:28:54 crc kubenswrapper[4898]: I1211 13:28:54.779127 4898 generic.go:334] "Generic (PLEG): container finished" podID="cec835c2-4160-4de3-ac12-7e808d0882b3" containerID="2596a74cf684f98fac072af76d18988d5cbfb02f9add9d93ea9ed2a758b72ca2" exitCode=0 Dec 11 13:28:54 crc kubenswrapper[4898]: I1211 13:28:54.797915 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjmfk" event={"ID":"cec835c2-4160-4de3-ac12-7e808d0882b3","Type":"ContainerDied","Data":"2596a74cf684f98fac072af76d18988d5cbfb02f9add9d93ea9ed2a758b72ca2"} Dec 11 13:28:54 crc kubenswrapper[4898]: I1211 13:28:54.797957 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjmfk" event={"ID":"cec835c2-4160-4de3-ac12-7e808d0882b3","Type":"ContainerStarted","Data":"fdbf33e56068f5e21b63a35214eb3d35a7049edd994047a8996fbd7549f8c1f9"} Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.365284 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.365872 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="ceilometer-central-agent" containerID="cri-o://4ac829fa4779e37c5e1d3153325769c40f5a68d2c0629692f66c1f8d44a2b591" gracePeriod=30 Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.365954 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="sg-core" containerID="cri-o://2337b6827599a355a74164b0ff0906446a51991d184793e68007ee6f656a4d94" gracePeriod=30 Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.365987 4898 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="ceilometer-notification-agent" containerID="cri-o://6fc713df7449b7c4adc18bdc23c5d4003539ee4c4ce910a2cffe2c474eeb0e4e" gracePeriod=30 Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.366152 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="proxy-httpd" containerID="cri-o://4458eb4c87aa70fbd2c81f49dba275d2ee2a8dd1fc614ec66f570958f2340070" gracePeriod=30 Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.398041 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.772499 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ngkww" Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.772786 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ngkww" Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.792467 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjmfk" event={"ID":"cec835c2-4160-4de3-ac12-7e808d0882b3","Type":"ContainerStarted","Data":"d4e4c4012d995f9a3a0d09a553e88e460cd092b296f6614c95d691c579e53039"} Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.795560 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b73e56da-3159-42d6-a1f8-d72c25c82451","Type":"ContainerStarted","Data":"ebde97722847b184057ddff9e66777896d279cb30e997da67b1e0fc4bc72b07c"} Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.803671 4898 generic.go:334] "Generic (PLEG): container 
finished" podID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerID="4458eb4c87aa70fbd2c81f49dba275d2ee2a8dd1fc614ec66f570958f2340070" exitCode=0 Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.803706 4898 generic.go:334] "Generic (PLEG): container finished" podID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerID="2337b6827599a355a74164b0ff0906446a51991d184793e68007ee6f656a4d94" exitCode=2 Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.803716 4898 generic.go:334] "Generic (PLEG): container finished" podID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerID="4ac829fa4779e37c5e1d3153325769c40f5a68d2c0629692f66c1f8d44a2b591" exitCode=0 Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.803742 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f","Type":"ContainerDied","Data":"4458eb4c87aa70fbd2c81f49dba275d2ee2a8dd1fc614ec66f570958f2340070"} Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.803769 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f","Type":"ContainerDied","Data":"2337b6827599a355a74164b0ff0906446a51991d184793e68007ee6f656a4d94"} Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.803782 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f","Type":"ContainerDied","Data":"4ac829fa4779e37c5e1d3153325769c40f5a68d2c0629692f66c1f8d44a2b591"} Dec 11 13:28:55 crc kubenswrapper[4898]: I1211 13:28:55.830522 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ngkww" Dec 11 13:28:56 crc kubenswrapper[4898]: I1211 13:28:56.398706 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 11 13:28:56 crc kubenswrapper[4898]: I1211 13:28:56.716720 4898 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 11 13:28:56 crc kubenswrapper[4898]: I1211 13:28:56.992841 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 11 13:28:57 crc kubenswrapper[4898]: I1211 13:28:57.025727 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 11 13:28:57 crc kubenswrapper[4898]: I1211 13:28:57.031063 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 11 13:28:57 crc kubenswrapper[4898]: I1211 13:28:57.031085 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 11 13:28:57 crc kubenswrapper[4898]: I1211 13:28:57.856213 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 11 13:28:58 crc kubenswrapper[4898]: I1211 13:28:58.048715 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c9d50b6b-d1c5-4072-964e-475b4bfb685d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.245:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 13:28:58 crc kubenswrapper[4898]: I1211 13:28:58.048745 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c9d50b6b-d1c5-4072-964e-475b4bfb685d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.245:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 13:28:58 crc kubenswrapper[4898]: I1211 13:28:58.263001 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 11 13:28:58 crc kubenswrapper[4898]: I1211 13:28:58.263573 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 
11 13:28:59 crc kubenswrapper[4898]: I1211 13:28:59.352640 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="db5a5721-a011-492c-b79d-cc175721a3c4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.246:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 13:28:59 crc kubenswrapper[4898]: I1211 13:28:59.352626 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="db5a5721-a011-492c-b79d-cc175721a3c4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.246:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 13:28:59 crc kubenswrapper[4898]: I1211 13:28:59.849344 4898 generic.go:334] "Generic (PLEG): container finished" podID="cec835c2-4160-4de3-ac12-7e808d0882b3" containerID="d4e4c4012d995f9a3a0d09a553e88e460cd092b296f6614c95d691c579e53039" exitCode=0 Dec 11 13:28:59 crc kubenswrapper[4898]: I1211 13:28:59.849423 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjmfk" event={"ID":"cec835c2-4160-4de3-ac12-7e808d0882b3","Type":"ContainerDied","Data":"d4e4c4012d995f9a3a0d09a553e88e460cd092b296f6614c95d691c579e53039"} Dec 11 13:28:59 crc kubenswrapper[4898]: I1211 13:28:59.856396 4898 generic.go:334] "Generic (PLEG): container finished" podID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerID="6fc713df7449b7c4adc18bdc23c5d4003539ee4c4ce910a2cffe2c474eeb0e4e" exitCode=0 Dec 11 13:28:59 crc kubenswrapper[4898]: I1211 13:28:59.857543 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f","Type":"ContainerDied","Data":"6fc713df7449b7c4adc18bdc23c5d4003539ee4c4ce910a2cffe2c474eeb0e4e"} Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.526504 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.677263 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-run-httpd\") pod \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.677319 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p2zs\" (UniqueName: \"kubernetes.io/projected/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-kube-api-access-7p2zs\") pod \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.677342 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-combined-ca-bundle\") pod \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.677396 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-config-data\") pod \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.677511 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-log-httpd\") pod \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.677558 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-sg-core-conf-yaml\") pod \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.677682 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-scripts\") pod \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\" (UID: \"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f\") " Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.678376 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" (UID: "c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.678805 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" (UID: "c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.684081 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-kube-api-access-7p2zs" (OuterVolumeSpecName: "kube-api-access-7p2zs") pod "c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" (UID: "c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f"). InnerVolumeSpecName "kube-api-access-7p2zs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.684169 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-scripts" (OuterVolumeSpecName: "scripts") pod "c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" (UID: "c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.721894 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" (UID: "c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.780648 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.780679 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p2zs\" (UniqueName: \"kubernetes.io/projected/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-kube-api-access-7p2zs\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.780690 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.780700 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 11 
13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.780708 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.785584 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" (UID: "c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.803995 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-config-data" (OuterVolumeSpecName: "config-data") pod "c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" (UID: "c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.869394 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b73e56da-3159-42d6-a1f8-d72c25c82451","Type":"ContainerStarted","Data":"f5c9cad012b20f086cf7603bb966371d258a905d3caa52f32907ce842f5015b5"} Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.874636 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f","Type":"ContainerDied","Data":"f38dd567c531a4bddff5efd86837853f031eecc1207dc7144de9bfd6c482f126"} Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.874708 4898 scope.go:117] "RemoveContainer" containerID="4458eb4c87aa70fbd2c81f49dba275d2ee2a8dd1fc614ec66f570958f2340070" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.874727 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.883153 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.883188 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.918821 4898 scope.go:117] "RemoveContainer" containerID="2337b6827599a355a74164b0ff0906446a51991d184793e68007ee6f656a4d94" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.924520 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.938006 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.954377 4898 scope.go:117] "RemoveContainer" containerID="6fc713df7449b7c4adc18bdc23c5d4003539ee4c4ce910a2cffe2c474eeb0e4e" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.988926 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:00 crc kubenswrapper[4898]: E1211 13:29:00.989530 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="ceilometer-central-agent" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.989554 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="ceilometer-central-agent" Dec 11 13:29:00 crc kubenswrapper[4898]: E1211 13:29:00.989602 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="ceilometer-notification-agent" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.989612 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="ceilometer-notification-agent" Dec 11 13:29:00 crc kubenswrapper[4898]: E1211 13:29:00.989633 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="sg-core" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.989642 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="sg-core" Dec 11 13:29:00 crc kubenswrapper[4898]: E1211 13:29:00.989673 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="proxy-httpd" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.989684 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="proxy-httpd" Dec 11 13:29:00 crc 
kubenswrapper[4898]: I1211 13:29:00.989960 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="ceilometer-central-agent" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.989988 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="sg-core" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.990002 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="proxy-httpd" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.990043 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="ceilometer-notification-agent" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.992932 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.994803 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.996103 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 13:29:00 crc kubenswrapper[4898]: I1211 13:29:00.998370 4898 scope.go:117] "RemoveContainer" containerID="4ac829fa4779e37c5e1d3153325769c40f5a68d2c0629692f66c1f8d44a2b591" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.005018 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.088235 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " 
pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.088428 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-run-httpd\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.088619 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-log-httpd\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.088666 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.088690 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-scripts\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.088809 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-config-data\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.088912 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk2r6\" (UniqueName: \"kubernetes.io/projected/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-kube-api-access-wk2r6\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.191039 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-run-httpd\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.191153 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-log-httpd\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.191196 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.191223 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-scripts\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.191275 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-config-data\") pod \"ceilometer-0\" (UID: 
\"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.191319 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk2r6\" (UniqueName: \"kubernetes.io/projected/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-kube-api-access-wk2r6\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.191507 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.191743 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-log-httpd\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.192242 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-run-httpd\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.194570 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.194642 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-scripts\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.196929 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.197479 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-config-data\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.227551 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk2r6\" (UniqueName: \"kubernetes.io/projected/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-kube-api-access-wk2r6\") pod \"ceilometer-0\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " pod="openstack/ceilometer-0" Dec 11 13:29:01 crc kubenswrapper[4898]: I1211 13:29:01.320597 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:29:02 crc kubenswrapper[4898]: I1211 13:29:02.123707 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:02 crc kubenswrapper[4898]: I1211 13:29:02.791399 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" path="/var/lib/kubelet/pods/c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f/volumes" Dec 11 13:29:02 crc kubenswrapper[4898]: I1211 13:29:02.915068 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjmfk" event={"ID":"cec835c2-4160-4de3-ac12-7e808d0882b3","Type":"ContainerStarted","Data":"9292cebc1b48aeecbc0ce9044ffe5d53bb4221f09cd1347d0fbd64f4940c6b97"} Dec 11 13:29:02 crc kubenswrapper[4898]: I1211 13:29:02.918249 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1","Type":"ContainerStarted","Data":"006520574b5be8ba9b5061a97ddb08cc80007e96279a08430f7f96b4aba9a8be"} Dec 11 13:29:02 crc kubenswrapper[4898]: I1211 13:29:02.920563 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b73e56da-3159-42d6-a1f8-d72c25c82451","Type":"ContainerStarted","Data":"c20379cd692de345371ccacef2282a32d999fa39e12d626e1bb6a5519452e5e4"} Dec 11 13:29:02 crc kubenswrapper[4898]: I1211 13:29:02.943580 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wjmfk" podStartSLOduration=3.6913494140000003 podStartE2EDuration="10.943543539s" podCreationTimestamp="2025-12-11 13:28:52 +0000 UTC" firstStartedPulling="2025-12-11 13:28:54.792445697 +0000 UTC m=+1492.364772134" lastFinishedPulling="2025-12-11 13:29:02.044639822 +0000 UTC m=+1499.616966259" observedRunningTime="2025-12-11 13:29:02.935756208 +0000 UTC m=+1500.508082655" watchObservedRunningTime="2025-12-11 13:29:02.943543539 +0000 UTC 
m=+1500.515869976" Dec 11 13:29:03 crc kubenswrapper[4898]: I1211 13:29:03.273102 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:29:03 crc kubenswrapper[4898]: I1211 13:29:03.273147 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:29:03 crc kubenswrapper[4898]: I1211 13:29:03.490860 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:29:03 crc kubenswrapper[4898]: I1211 13:29:03.606645 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:29:03 crc kubenswrapper[4898]: I1211 13:29:03.945255 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1","Type":"ContainerStarted","Data":"96d65975ddadea8e8533d5cdb78a2e7c2895759ef5f22df8d1ca2d722bbbd40d"} Dec 11 13:29:04 crc kubenswrapper[4898]: I1211 13:29:04.155383 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7g6rk"] Dec 11 13:29:04 crc kubenswrapper[4898]: I1211 13:29:04.334933 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wjmfk" podUID="cec835c2-4160-4de3-ac12-7e808d0882b3" containerName="registry-server" probeResult="failure" output=< Dec 11 13:29:04 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 13:29:04 crc kubenswrapper[4898]: > Dec 11 13:29:04 crc kubenswrapper[4898]: I1211 13:29:04.957860 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b73e56da-3159-42d6-a1f8-d72c25c82451","Type":"ContainerStarted","Data":"03b964980a19479b34062f3c357c1cab94e007b1fa03cc7f46ee1ba0608dab69"} Dec 11 13:29:04 crc kubenswrapper[4898]: 
I1211 13:29:04.958022 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7g6rk" podUID="78074bfc-7a8f-4154-99f1-6c1abefca223" containerName="registry-server" containerID="cri-o://0293b7791f85ddab3b413f003364e5031078f21cb17e1c37d07a7035959e8955" gracePeriod=2 Dec 11 13:29:04 crc kubenswrapper[4898]: I1211 13:29:04.958086 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-api" containerID="cri-o://ebde97722847b184057ddff9e66777896d279cb30e997da67b1e0fc4bc72b07c" gracePeriod=30 Dec 11 13:29:04 crc kubenswrapper[4898]: I1211 13:29:04.958135 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-notifier" containerID="cri-o://c20379cd692de345371ccacef2282a32d999fa39e12d626e1bb6a5519452e5e4" gracePeriod=30 Dec 11 13:29:04 crc kubenswrapper[4898]: I1211 13:29:04.958151 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-evaluator" containerID="cri-o://f5c9cad012b20f086cf7603bb966371d258a905d3caa52f32907ce842f5015b5" gracePeriod=30 Dec 11 13:29:04 crc kubenswrapper[4898]: I1211 13:29:04.958152 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-listener" containerID="cri-o://03b964980a19479b34062f3c357c1cab94e007b1fa03cc7f46ee1ba0608dab69" gracePeriod=30 Dec 11 13:29:04 crc kubenswrapper[4898]: I1211 13:29:04.995835 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.284913223 podStartE2EDuration="12.995813589s" podCreationTimestamp="2025-12-11 13:28:52 +0000 UTC" firstStartedPulling="2025-12-11 
13:28:53.769627517 +0000 UTC m=+1491.341953954" lastFinishedPulling="2025-12-11 13:29:04.480527883 +0000 UTC m=+1502.052854320" observedRunningTime="2025-12-11 13:29:04.98956801 +0000 UTC m=+1502.561894457" watchObservedRunningTime="2025-12-11 13:29:04.995813589 +0000 UTC m=+1502.568140016" Dec 11 13:29:05 crc kubenswrapper[4898]: I1211 13:29:05.770209 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:29:05 crc kubenswrapper[4898]: I1211 13:29:05.780917 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:05 crc kubenswrapper[4898]: I1211 13:29:05.850600 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ngkww" Dec 11 13:29:05 crc kubenswrapper[4898]: I1211 13:29:05.948348 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq7p9\" (UniqueName: \"kubernetes.io/projected/78074bfc-7a8f-4154-99f1-6c1abefca223-kube-api-access-jq7p9\") pod \"78074bfc-7a8f-4154-99f1-6c1abefca223\" (UID: \"78074bfc-7a8f-4154-99f1-6c1abefca223\") " Dec 11 13:29:05 crc kubenswrapper[4898]: I1211 13:29:05.949626 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjjt9\" (UniqueName: \"kubernetes.io/projected/70ba1510-a232-4614-bced-c1afee3bd9b2-kube-api-access-kjjt9\") pod \"70ba1510-a232-4614-bced-c1afee3bd9b2\" (UID: \"70ba1510-a232-4614-bced-c1afee3bd9b2\") " Dec 11 13:29:05 crc kubenswrapper[4898]: I1211 13:29:05.949811 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78074bfc-7a8f-4154-99f1-6c1abefca223-catalog-content\") pod \"78074bfc-7a8f-4154-99f1-6c1abefca223\" (UID: \"78074bfc-7a8f-4154-99f1-6c1abefca223\") " Dec 11 13:29:05 crc kubenswrapper[4898]: 
I1211 13:29:05.949952 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ba1510-a232-4614-bced-c1afee3bd9b2-combined-ca-bundle\") pod \"70ba1510-a232-4614-bced-c1afee3bd9b2\" (UID: \"70ba1510-a232-4614-bced-c1afee3bd9b2\") " Dec 11 13:29:05 crc kubenswrapper[4898]: I1211 13:29:05.950084 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78074bfc-7a8f-4154-99f1-6c1abefca223-utilities\") pod \"78074bfc-7a8f-4154-99f1-6c1abefca223\" (UID: \"78074bfc-7a8f-4154-99f1-6c1abefca223\") " Dec 11 13:29:05 crc kubenswrapper[4898]: I1211 13:29:05.950126 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ba1510-a232-4614-bced-c1afee3bd9b2-config-data\") pod \"70ba1510-a232-4614-bced-c1afee3bd9b2\" (UID: \"70ba1510-a232-4614-bced-c1afee3bd9b2\") " Dec 11 13:29:05 crc kubenswrapper[4898]: I1211 13:29:05.960146 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78074bfc-7a8f-4154-99f1-6c1abefca223-utilities" (OuterVolumeSpecName: "utilities") pod "78074bfc-7a8f-4154-99f1-6c1abefca223" (UID: "78074bfc-7a8f-4154-99f1-6c1abefca223"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:29:05 crc kubenswrapper[4898]: I1211 13:29:05.980216 4898 generic.go:334] "Generic (PLEG): container finished" podID="78074bfc-7a8f-4154-99f1-6c1abefca223" containerID="0293b7791f85ddab3b413f003364e5031078f21cb17e1c37d07a7035959e8955" exitCode=0 Dec 11 13:29:05 crc kubenswrapper[4898]: I1211 13:29:05.980477 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7g6rk" event={"ID":"78074bfc-7a8f-4154-99f1-6c1abefca223","Type":"ContainerDied","Data":"0293b7791f85ddab3b413f003364e5031078f21cb17e1c37d07a7035959e8955"} Dec 11 13:29:05 crc kubenswrapper[4898]: I1211 13:29:05.980565 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7g6rk" event={"ID":"78074bfc-7a8f-4154-99f1-6c1abefca223","Type":"ContainerDied","Data":"9a6634885421b535622a9a24d8307b1e59728cd80bfdb1ea5216b34583b19a55"} Dec 11 13:29:05 crc kubenswrapper[4898]: I1211 13:29:05.980627 4898 scope.go:117] "RemoveContainer" containerID="0293b7791f85ddab3b413f003364e5031078f21cb17e1c37d07a7035959e8955" Dec 11 13:29:05 crc kubenswrapper[4898]: I1211 13:29:05.980790 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7g6rk" Dec 11 13:29:05 crc kubenswrapper[4898]: I1211 13:29:05.999724 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78074bfc-7a8f-4154-99f1-6c1abefca223-kube-api-access-jq7p9" (OuterVolumeSpecName: "kube-api-access-jq7p9") pod "78074bfc-7a8f-4154-99f1-6c1abefca223" (UID: "78074bfc-7a8f-4154-99f1-6c1abefca223"). InnerVolumeSpecName "kube-api-access-jq7p9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.001087 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1","Type":"ContainerStarted","Data":"7e982e672761b461554d02f60835ac21b3ae4e3c28a83b45aa4cdd86cd1143a1"} Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.004344 4898 generic.go:334] "Generic (PLEG): container finished" podID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerID="f5c9cad012b20f086cf7603bb966371d258a905d3caa52f32907ce842f5015b5" exitCode=0 Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.004472 4898 generic.go:334] "Generic (PLEG): container finished" podID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerID="ebde97722847b184057ddff9e66777896d279cb30e997da67b1e0fc4bc72b07c" exitCode=0 Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.004659 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b73e56da-3159-42d6-a1f8-d72c25c82451","Type":"ContainerDied","Data":"f5c9cad012b20f086cf7603bb966371d258a905d3caa52f32907ce842f5015b5"} Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.004820 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b73e56da-3159-42d6-a1f8-d72c25c82451","Type":"ContainerDied","Data":"ebde97722847b184057ddff9e66777896d279cb30e997da67b1e0fc4bc72b07c"} Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.009268 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ba1510-a232-4614-bced-c1afee3bd9b2-kube-api-access-kjjt9" (OuterVolumeSpecName: "kube-api-access-kjjt9") pod "70ba1510-a232-4614-bced-c1afee3bd9b2" (UID: "70ba1510-a232-4614-bced-c1afee3bd9b2"). InnerVolumeSpecName "kube-api-access-kjjt9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.038859 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ba1510-a232-4614-bced-c1afee3bd9b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70ba1510-a232-4614-bced-c1afee3bd9b2" (UID: "70ba1510-a232-4614-bced-c1afee3bd9b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.039855 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ba1510-a232-4614-bced-c1afee3bd9b2-config-data" (OuterVolumeSpecName: "config-data") pod "70ba1510-a232-4614-bced-c1afee3bd9b2" (UID: "70ba1510-a232-4614-bced-c1afee3bd9b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.039945 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78074bfc-7a8f-4154-99f1-6c1abefca223-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78074bfc-7a8f-4154-99f1-6c1abefca223" (UID: "78074bfc-7a8f-4154-99f1-6c1abefca223"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.050137 4898 generic.go:334] "Generic (PLEG): container finished" podID="70ba1510-a232-4614-bced-c1afee3bd9b2" containerID="98333e506b604b553b1094b55473be7e6becb832ff69dccac03718315c85834a" exitCode=137 Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.050207 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"70ba1510-a232-4614-bced-c1afee3bd9b2","Type":"ContainerDied","Data":"98333e506b604b553b1094b55473be7e6becb832ff69dccac03718315c85834a"} Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.050243 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"70ba1510-a232-4614-bced-c1afee3bd9b2","Type":"ContainerDied","Data":"e25ffe273c6a89cb5f8d40bdd5dc5663f71d640d048239602d586867d8343925"} Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.050307 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.059551 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ba1510-a232-4614-bced-c1afee3bd9b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.059858 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78074bfc-7a8f-4154-99f1-6c1abefca223-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.059994 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ba1510-a232-4614-bced-c1afee3bd9b2-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.060075 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq7p9\" (UniqueName: \"kubernetes.io/projected/78074bfc-7a8f-4154-99f1-6c1abefca223-kube-api-access-jq7p9\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.060146 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjjt9\" (UniqueName: \"kubernetes.io/projected/70ba1510-a232-4614-bced-c1afee3bd9b2-kube-api-access-kjjt9\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.060218 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78074bfc-7a8f-4154-99f1-6c1abefca223-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.065618 4898 scope.go:117] "RemoveContainer" containerID="fc93ad9481faaf06cb9cfa1e3eac5eb1eb43d20dd7aaf5112cc1512642115e96" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.126485 4898 scope.go:117] "RemoveContainer" 
containerID="66042f98a624765e3a0cc63862363ac26489d7658ed44c5c5ddf20df39f8300e" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.132525 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.154963 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.177027 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 13:29:06 crc kubenswrapper[4898]: E1211 13:29:06.178367 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78074bfc-7a8f-4154-99f1-6c1abefca223" containerName="extract-utilities" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.178425 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="78074bfc-7a8f-4154-99f1-6c1abefca223" containerName="extract-utilities" Dec 11 13:29:06 crc kubenswrapper[4898]: E1211 13:29:06.178525 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ba1510-a232-4614-bced-c1afee3bd9b2" containerName="nova-cell1-novncproxy-novncproxy" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.178569 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ba1510-a232-4614-bced-c1afee3bd9b2" containerName="nova-cell1-novncproxy-novncproxy" Dec 11 13:29:06 crc kubenswrapper[4898]: E1211 13:29:06.178597 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78074bfc-7a8f-4154-99f1-6c1abefca223" containerName="registry-server" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.178609 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="78074bfc-7a8f-4154-99f1-6c1abefca223" containerName="registry-server" Dec 11 13:29:06 crc kubenswrapper[4898]: E1211 13:29:06.178682 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78074bfc-7a8f-4154-99f1-6c1abefca223" containerName="extract-content" Dec 11 
13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.178694 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="78074bfc-7a8f-4154-99f1-6c1abefca223" containerName="extract-content" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.178820 4898 scope.go:117] "RemoveContainer" containerID="0293b7791f85ddab3b413f003364e5031078f21cb17e1c37d07a7035959e8955" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.179176 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="78074bfc-7a8f-4154-99f1-6c1abefca223" containerName="registry-server" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.179267 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ba1510-a232-4614-bced-c1afee3bd9b2" containerName="nova-cell1-novncproxy-novncproxy" Dec 11 13:29:06 crc kubenswrapper[4898]: E1211 13:29:06.181726 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0293b7791f85ddab3b413f003364e5031078f21cb17e1c37d07a7035959e8955\": container with ID starting with 0293b7791f85ddab3b413f003364e5031078f21cb17e1c37d07a7035959e8955 not found: ID does not exist" containerID="0293b7791f85ddab3b413f003364e5031078f21cb17e1c37d07a7035959e8955" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.181768 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0293b7791f85ddab3b413f003364e5031078f21cb17e1c37d07a7035959e8955"} err="failed to get container status \"0293b7791f85ddab3b413f003364e5031078f21cb17e1c37d07a7035959e8955\": rpc error: code = NotFound desc = could not find container \"0293b7791f85ddab3b413f003364e5031078f21cb17e1c37d07a7035959e8955\": container with ID starting with 0293b7791f85ddab3b413f003364e5031078f21cb17e1c37d07a7035959e8955 not found: ID does not exist" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.181795 4898 scope.go:117] "RemoveContainer" 
containerID="fc93ad9481faaf06cb9cfa1e3eac5eb1eb43d20dd7aaf5112cc1512642115e96" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.183087 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: E1211 13:29:06.183772 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc93ad9481faaf06cb9cfa1e3eac5eb1eb43d20dd7aaf5112cc1512642115e96\": container with ID starting with fc93ad9481faaf06cb9cfa1e3eac5eb1eb43d20dd7aaf5112cc1512642115e96 not found: ID does not exist" containerID="fc93ad9481faaf06cb9cfa1e3eac5eb1eb43d20dd7aaf5112cc1512642115e96" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.183807 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc93ad9481faaf06cb9cfa1e3eac5eb1eb43d20dd7aaf5112cc1512642115e96"} err="failed to get container status \"fc93ad9481faaf06cb9cfa1e3eac5eb1eb43d20dd7aaf5112cc1512642115e96\": rpc error: code = NotFound desc = could not find container \"fc93ad9481faaf06cb9cfa1e3eac5eb1eb43d20dd7aaf5112cc1512642115e96\": container with ID starting with fc93ad9481faaf06cb9cfa1e3eac5eb1eb43d20dd7aaf5112cc1512642115e96 not found: ID does not exist" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.183829 4898 scope.go:117] "RemoveContainer" containerID="66042f98a624765e3a0cc63862363ac26489d7658ed44c5c5ddf20df39f8300e" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.186909 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.187057 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.187746 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 11 13:29:06 crc kubenswrapper[4898]: E1211 13:29:06.193013 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66042f98a624765e3a0cc63862363ac26489d7658ed44c5c5ddf20df39f8300e\": container with ID starting with 66042f98a624765e3a0cc63862363ac26489d7658ed44c5c5ddf20df39f8300e not found: ID does not exist" containerID="66042f98a624765e3a0cc63862363ac26489d7658ed44c5c5ddf20df39f8300e" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.193389 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66042f98a624765e3a0cc63862363ac26489d7658ed44c5c5ddf20df39f8300e"} err="failed to get container status \"66042f98a624765e3a0cc63862363ac26489d7658ed44c5c5ddf20df39f8300e\": rpc error: code = NotFound desc = could not find container \"66042f98a624765e3a0cc63862363ac26489d7658ed44c5c5ddf20df39f8300e\": container with ID starting with 66042f98a624765e3a0cc63862363ac26489d7658ed44c5c5ddf20df39f8300e not found: ID does not exist" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.193532 4898 scope.go:117] "RemoveContainer" containerID="98333e506b604b553b1094b55473be7e6becb832ff69dccac03718315c85834a" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.194142 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.248660 4898 scope.go:117] "RemoveContainer" containerID="98333e506b604b553b1094b55473be7e6becb832ff69dccac03718315c85834a" Dec 11 13:29:06 crc kubenswrapper[4898]: E1211 13:29:06.249374 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98333e506b604b553b1094b55473be7e6becb832ff69dccac03718315c85834a\": container with ID starting with 98333e506b604b553b1094b55473be7e6becb832ff69dccac03718315c85834a not found: 
ID does not exist" containerID="98333e506b604b553b1094b55473be7e6becb832ff69dccac03718315c85834a" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.249623 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98333e506b604b553b1094b55473be7e6becb832ff69dccac03718315c85834a"} err="failed to get container status \"98333e506b604b553b1094b55473be7e6becb832ff69dccac03718315c85834a\": rpc error: code = NotFound desc = could not find container \"98333e506b604b553b1094b55473be7e6becb832ff69dccac03718315c85834a\": container with ID starting with 98333e506b604b553b1094b55473be7e6becb832ff69dccac03718315c85834a not found: ID does not exist" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.267090 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2660b5e-f832-4d7b-8fa5-f3f661708a33-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2660b5e-f832-4d7b-8fa5-f3f661708a33\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.267211 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2660b5e-f832-4d7b-8fa5-f3f661708a33-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2660b5e-f832-4d7b-8fa5-f3f661708a33\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.267311 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2660b5e-f832-4d7b-8fa5-f3f661708a33-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2660b5e-f832-4d7b-8fa5-f3f661708a33\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.267355 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2660b5e-f832-4d7b-8fa5-f3f661708a33-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2660b5e-f832-4d7b-8fa5-f3f661708a33\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.267388 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dfmt\" (UniqueName: \"kubernetes.io/projected/b2660b5e-f832-4d7b-8fa5-f3f661708a33-kube-api-access-6dfmt\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2660b5e-f832-4d7b-8fa5-f3f661708a33\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.329357 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7g6rk"] Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.345372 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7g6rk"] Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.369680 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2660b5e-f832-4d7b-8fa5-f3f661708a33-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2660b5e-f832-4d7b-8fa5-f3f661708a33\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.369967 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2660b5e-f832-4d7b-8fa5-f3f661708a33-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2660b5e-f832-4d7b-8fa5-f3f661708a33\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.370095 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2660b5e-f832-4d7b-8fa5-f3f661708a33-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2660b5e-f832-4d7b-8fa5-f3f661708a33\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.370237 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2660b5e-f832-4d7b-8fa5-f3f661708a33-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2660b5e-f832-4d7b-8fa5-f3f661708a33\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.370318 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dfmt\" (UniqueName: \"kubernetes.io/projected/b2660b5e-f832-4d7b-8fa5-f3f661708a33-kube-api-access-6dfmt\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2660b5e-f832-4d7b-8fa5-f3f661708a33\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.375935 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2660b5e-f832-4d7b-8fa5-f3f661708a33-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2660b5e-f832-4d7b-8fa5-f3f661708a33\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.377038 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2660b5e-f832-4d7b-8fa5-f3f661708a33-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2660b5e-f832-4d7b-8fa5-f3f661708a33\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.380195 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b2660b5e-f832-4d7b-8fa5-f3f661708a33-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2660b5e-f832-4d7b-8fa5-f3f661708a33\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.382783 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2660b5e-f832-4d7b-8fa5-f3f661708a33-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2660b5e-f832-4d7b-8fa5-f3f661708a33\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.391060 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dfmt\" (UniqueName: \"kubernetes.io/projected/b2660b5e-f832-4d7b-8fa5-f3f661708a33-kube-api-access-6dfmt\") pod \"nova-cell1-novncproxy-0\" (UID: \"b2660b5e-f832-4d7b-8fa5-f3f661708a33\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.526155 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.798215 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70ba1510-a232-4614-bced-c1afee3bd9b2" path="/var/lib/kubelet/pods/70ba1510-a232-4614-bced-c1afee3bd9b2/volumes" Dec 11 13:29:06 crc kubenswrapper[4898]: I1211 13:29:06.799427 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78074bfc-7a8f-4154-99f1-6c1abefca223" path="/var/lib/kubelet/pods/78074bfc-7a8f-4154-99f1-6c1abefca223/volumes" Dec 11 13:29:07 crc kubenswrapper[4898]: I1211 13:29:07.070529 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1","Type":"ContainerStarted","Data":"a2a6904d83685e8b613df8f2a0de327d08de9e1fd7e4ee1ca2e2ac818bb7079e"} Dec 11 13:29:07 crc kubenswrapper[4898]: I1211 13:29:07.084697 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 11 13:29:07 crc kubenswrapper[4898]: I1211 13:29:07.084768 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 11 13:29:07 crc kubenswrapper[4898]: I1211 13:29:07.092075 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 13:29:07 crc kubenswrapper[4898]: I1211 13:29:07.099381 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 11 13:29:07 crc kubenswrapper[4898]: I1211 13:29:07.112475 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.105313 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1","Type":"ContainerStarted","Data":"ef979401f0c2ae6d9c77bbbb58c826b3542fa1b4bbff35ae4e707c39ee551530"} 
Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.106126 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.114032 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b2660b5e-f832-4d7b-8fa5-f3f661708a33","Type":"ContainerStarted","Data":"0f9af9d5bb13befe34e6c225bd3845eddd99f7099e5474bf47832bfda99c0194"} Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.114085 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b2660b5e-f832-4d7b-8fa5-f3f661708a33","Type":"ContainerStarted","Data":"25d3088a8ccc2f109d1be670850cd0c68976f071bd595f700d49396c4aa15f7a"} Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.163247 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.55748197 podStartE2EDuration="8.163227898s" podCreationTimestamp="2025-12-11 13:29:00 +0000 UTC" firstStartedPulling="2025-12-11 13:29:02.146092927 +0000 UTC m=+1499.718419374" lastFinishedPulling="2025-12-11 13:29:07.751838865 +0000 UTC m=+1505.324165302" observedRunningTime="2025-12-11 13:29:08.133615076 +0000 UTC m=+1505.705941513" watchObservedRunningTime="2025-12-11 13:29:08.163227898 +0000 UTC m=+1505.735554335" Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.175842 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngkww"] Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.176117 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ngkww" podUID="09202c46-beab-4e3f-aa19-f693116c3d85" containerName="registry-server" containerID="cri-o://3ff27c4fa2fa8cc4c189febb36e4ce98cb7e281ff565f7f3a320e826fb891a84" gracePeriod=2 Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.182255 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.182236452 podStartE2EDuration="2.182236452s" podCreationTimestamp="2025-12-11 13:29:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:29:08.15038679 +0000 UTC m=+1505.722713227" watchObservedRunningTime="2025-12-11 13:29:08.182236452 +0000 UTC m=+1505.754562889" Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.267377 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.267765 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.274245 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.278289 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.760787 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngkww" Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.943768 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7p7j\" (UniqueName: \"kubernetes.io/projected/09202c46-beab-4e3f-aa19-f693116c3d85-kube-api-access-j7p7j\") pod \"09202c46-beab-4e3f-aa19-f693116c3d85\" (UID: \"09202c46-beab-4e3f-aa19-f693116c3d85\") " Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.943933 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09202c46-beab-4e3f-aa19-f693116c3d85-catalog-content\") pod \"09202c46-beab-4e3f-aa19-f693116c3d85\" (UID: \"09202c46-beab-4e3f-aa19-f693116c3d85\") " Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.943980 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09202c46-beab-4e3f-aa19-f693116c3d85-utilities\") pod \"09202c46-beab-4e3f-aa19-f693116c3d85\" (UID: \"09202c46-beab-4e3f-aa19-f693116c3d85\") " Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.944566 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09202c46-beab-4e3f-aa19-f693116c3d85-utilities" (OuterVolumeSpecName: "utilities") pod "09202c46-beab-4e3f-aa19-f693116c3d85" (UID: "09202c46-beab-4e3f-aa19-f693116c3d85"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.944875 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09202c46-beab-4e3f-aa19-f693116c3d85-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.964020 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09202c46-beab-4e3f-aa19-f693116c3d85-kube-api-access-j7p7j" (OuterVolumeSpecName: "kube-api-access-j7p7j") pod "09202c46-beab-4e3f-aa19-f693116c3d85" (UID: "09202c46-beab-4e3f-aa19-f693116c3d85"). InnerVolumeSpecName "kube-api-access-j7p7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:08 crc kubenswrapper[4898]: I1211 13:29:08.972636 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09202c46-beab-4e3f-aa19-f693116c3d85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09202c46-beab-4e3f-aa19-f693116c3d85" (UID: "09202c46-beab-4e3f-aa19-f693116c3d85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.047182 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7p7j\" (UniqueName: \"kubernetes.io/projected/09202c46-beab-4e3f-aa19-f693116c3d85-kube-api-access-j7p7j\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.047241 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09202c46-beab-4e3f-aa19-f693116c3d85-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.126728 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngkww" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.126769 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngkww" event={"ID":"09202c46-beab-4e3f-aa19-f693116c3d85","Type":"ContainerDied","Data":"3ff27c4fa2fa8cc4c189febb36e4ce98cb7e281ff565f7f3a320e826fb891a84"} Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.126866 4898 scope.go:117] "RemoveContainer" containerID="3ff27c4fa2fa8cc4c189febb36e4ce98cb7e281ff565f7f3a320e826fb891a84" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.126620 4898 generic.go:334] "Generic (PLEG): container finished" podID="09202c46-beab-4e3f-aa19-f693116c3d85" containerID="3ff27c4fa2fa8cc4c189febb36e4ce98cb7e281ff565f7f3a320e826fb891a84" exitCode=0 Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.139918 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngkww" event={"ID":"09202c46-beab-4e3f-aa19-f693116c3d85","Type":"ContainerDied","Data":"fd46eb9eb485cc40a2b712d89de59deb61e811bbe5ac9d7475570cfc5903e01b"} Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.140159 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.146134 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.176930 4898 scope.go:117] "RemoveContainer" containerID="18c889b834d38549edd32686508be946d9c9b534d6bf7b681ca4e20b36efb634" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.195124 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngkww"] Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.234684 4898 scope.go:117] "RemoveContainer" 
containerID="d8f3fb958911a3fe38909e22367776c0d0d5a705a98dff53f9b0bd0e84d32b3d" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.268637 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngkww"] Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.312322 4898 scope.go:117] "RemoveContainer" containerID="3ff27c4fa2fa8cc4c189febb36e4ce98cb7e281ff565f7f3a320e826fb891a84" Dec 11 13:29:09 crc kubenswrapper[4898]: E1211 13:29:09.313610 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ff27c4fa2fa8cc4c189febb36e4ce98cb7e281ff565f7f3a320e826fb891a84\": container with ID starting with 3ff27c4fa2fa8cc4c189febb36e4ce98cb7e281ff565f7f3a320e826fb891a84 not found: ID does not exist" containerID="3ff27c4fa2fa8cc4c189febb36e4ce98cb7e281ff565f7f3a320e826fb891a84" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.313650 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ff27c4fa2fa8cc4c189febb36e4ce98cb7e281ff565f7f3a320e826fb891a84"} err="failed to get container status \"3ff27c4fa2fa8cc4c189febb36e4ce98cb7e281ff565f7f3a320e826fb891a84\": rpc error: code = NotFound desc = could not find container \"3ff27c4fa2fa8cc4c189febb36e4ce98cb7e281ff565f7f3a320e826fb891a84\": container with ID starting with 3ff27c4fa2fa8cc4c189febb36e4ce98cb7e281ff565f7f3a320e826fb891a84 not found: ID does not exist" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.313675 4898 scope.go:117] "RemoveContainer" containerID="18c889b834d38549edd32686508be946d9c9b534d6bf7b681ca4e20b36efb634" Dec 11 13:29:09 crc kubenswrapper[4898]: E1211 13:29:09.314046 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18c889b834d38549edd32686508be946d9c9b534d6bf7b681ca4e20b36efb634\": container with ID starting with 
18c889b834d38549edd32686508be946d9c9b534d6bf7b681ca4e20b36efb634 not found: ID does not exist" containerID="18c889b834d38549edd32686508be946d9c9b534d6bf7b681ca4e20b36efb634" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.314089 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18c889b834d38549edd32686508be946d9c9b534d6bf7b681ca4e20b36efb634"} err="failed to get container status \"18c889b834d38549edd32686508be946d9c9b534d6bf7b681ca4e20b36efb634\": rpc error: code = NotFound desc = could not find container \"18c889b834d38549edd32686508be946d9c9b534d6bf7b681ca4e20b36efb634\": container with ID starting with 18c889b834d38549edd32686508be946d9c9b534d6bf7b681ca4e20b36efb634 not found: ID does not exist" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.314154 4898 scope.go:117] "RemoveContainer" containerID="d8f3fb958911a3fe38909e22367776c0d0d5a705a98dff53f9b0bd0e84d32b3d" Dec 11 13:29:09 crc kubenswrapper[4898]: E1211 13:29:09.317880 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8f3fb958911a3fe38909e22367776c0d0d5a705a98dff53f9b0bd0e84d32b3d\": container with ID starting with d8f3fb958911a3fe38909e22367776c0d0d5a705a98dff53f9b0bd0e84d32b3d not found: ID does not exist" containerID="d8f3fb958911a3fe38909e22367776c0d0d5a705a98dff53f9b0bd0e84d32b3d" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.317917 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f3fb958911a3fe38909e22367776c0d0d5a705a98dff53f9b0bd0e84d32b3d"} err="failed to get container status \"d8f3fb958911a3fe38909e22367776c0d0d5a705a98dff53f9b0bd0e84d32b3d\": rpc error: code = NotFound desc = could not find container \"d8f3fb958911a3fe38909e22367776c0d0d5a705a98dff53f9b0bd0e84d32b3d\": container with ID starting with d8f3fb958911a3fe38909e22367776c0d0d5a705a98dff53f9b0bd0e84d32b3d not found: ID does not 
exist" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.375436 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-k4bhj"] Dec 11 13:29:09 crc kubenswrapper[4898]: E1211 13:29:09.376003 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09202c46-beab-4e3f-aa19-f693116c3d85" containerName="extract-content" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.376015 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="09202c46-beab-4e3f-aa19-f693116c3d85" containerName="extract-content" Dec 11 13:29:09 crc kubenswrapper[4898]: E1211 13:29:09.376033 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09202c46-beab-4e3f-aa19-f693116c3d85" containerName="registry-server" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.376039 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="09202c46-beab-4e3f-aa19-f693116c3d85" containerName="registry-server" Dec 11 13:29:09 crc kubenswrapper[4898]: E1211 13:29:09.376054 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09202c46-beab-4e3f-aa19-f693116c3d85" containerName="extract-utilities" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.376061 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="09202c46-beab-4e3f-aa19-f693116c3d85" containerName="extract-utilities" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.376268 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="09202c46-beab-4e3f-aa19-f693116c3d85" containerName="registry-server" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.377505 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.448526 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-k4bhj"] Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.482645 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.482738 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.482860 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqnwd\" (UniqueName: \"kubernetes.io/projected/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-kube-api-access-nqnwd\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.482985 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-config\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.483074 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.483349 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.585413 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.585493 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.585561 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqnwd\" (UniqueName: \"kubernetes.io/projected/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-kube-api-access-nqnwd\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.585645 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-config\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.585695 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.585747 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.586717 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.587229 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.587734 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-dns-svc\") pod 
\"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.588299 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.588315 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-config\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.613187 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqnwd\" (UniqueName: \"kubernetes.io/projected/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-kube-api-access-nqnwd\") pod \"dnsmasq-dns-f84f9ccf-k4bhj\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") " pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:09 crc kubenswrapper[4898]: I1211 13:29:09.720030 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:10 crc kubenswrapper[4898]: I1211 13:29:10.278546 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-k4bhj"] Dec 11 13:29:10 crc kubenswrapper[4898]: I1211 13:29:10.789687 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09202c46-beab-4e3f-aa19-f693116c3d85" path="/var/lib/kubelet/pods/09202c46-beab-4e3f-aa19-f693116c3d85/volumes" Dec 11 13:29:11 crc kubenswrapper[4898]: I1211 13:29:11.164272 4898 generic.go:334] "Generic (PLEG): container finished" podID="df10e08a-4d2c-4260-8930-0a64e2fe8b0d" containerID="8bb3b02915c58562a243cd9617c5a57019af46823515e8b413162ea23fc0752b" exitCode=0 Dec 11 13:29:11 crc kubenswrapper[4898]: I1211 13:29:11.164343 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" event={"ID":"df10e08a-4d2c-4260-8930-0a64e2fe8b0d","Type":"ContainerDied","Data":"8bb3b02915c58562a243cd9617c5a57019af46823515e8b413162ea23fc0752b"} Dec 11 13:29:11 crc kubenswrapper[4898]: I1211 13:29:11.164711 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" event={"ID":"df10e08a-4d2c-4260-8930-0a64e2fe8b0d","Type":"ContainerStarted","Data":"e6a6639c9b15ed8dbeb93c198b57242f136275fa48621bd33e7c9dd3da547616"} Dec 11 13:29:11 crc kubenswrapper[4898]: I1211 13:29:11.528087 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:12 crc kubenswrapper[4898]: I1211 13:29:12.177708 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" event={"ID":"df10e08a-4d2c-4260-8930-0a64e2fe8b0d","Type":"ContainerStarted","Data":"42b6bd9e6e276bb794412bcfaf33a934513de982344b0a975064af0a20c56491"} Dec 11 13:29:12 crc kubenswrapper[4898]: I1211 13:29:12.177955 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:12 crc kubenswrapper[4898]: I1211 13:29:12.202533 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" podStartSLOduration=3.202516582 podStartE2EDuration="3.202516582s" podCreationTimestamp="2025-12-11 13:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:29:12.202477631 +0000 UTC m=+1509.774804068" watchObservedRunningTime="2025-12-11 13:29:12.202516582 +0000 UTC m=+1509.774843019" Dec 11 13:29:12 crc kubenswrapper[4898]: I1211 13:29:12.389598 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:12 crc kubenswrapper[4898]: I1211 13:29:12.389860 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerName="ceilometer-central-agent" containerID="cri-o://96d65975ddadea8e8533d5cdb78a2e7c2895759ef5f22df8d1ca2d722bbbd40d" gracePeriod=30 Dec 11 13:29:12 crc kubenswrapper[4898]: I1211 13:29:12.389980 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerName="ceilometer-notification-agent" containerID="cri-o://7e982e672761b461554d02f60835ac21b3ae4e3c28a83b45aa4cdd86cd1143a1" gracePeriod=30 Dec 11 13:29:12 crc kubenswrapper[4898]: I1211 13:29:12.389980 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerName="proxy-httpd" containerID="cri-o://ef979401f0c2ae6d9c77bbbb58c826b3542fa1b4bbff35ae4e707c39ee551530" gracePeriod=30 Dec 11 13:29:12 crc kubenswrapper[4898]: I1211 13:29:12.389983 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerName="sg-core" containerID="cri-o://a2a6904d83685e8b613df8f2a0de327d08de9e1fd7e4ee1ca2e2ac818bb7079e" gracePeriod=30 Dec 11 13:29:12 crc kubenswrapper[4898]: I1211 13:29:12.685137 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:29:12 crc kubenswrapper[4898]: I1211 13:29:12.685730 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="db5a5721-a011-492c-b79d-cc175721a3c4" containerName="nova-api-log" containerID="cri-o://4710cb8ea8f1d8dc82a27c9e1822b82d18c87c680794c9cf0033f6878139aeeb" gracePeriod=30 Dec 11 13:29:12 crc kubenswrapper[4898]: I1211 13:29:12.685795 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="db5a5721-a011-492c-b79d-cc175721a3c4" containerName="nova-api-api" containerID="cri-o://183e17e589fd2c89941b4b6ee9a299c6676c61402b03bbb7f580231da70d85df" gracePeriod=30 Dec 11 13:29:13 crc kubenswrapper[4898]: I1211 13:29:13.191259 4898 generic.go:334] "Generic (PLEG): container finished" podID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerID="ef979401f0c2ae6d9c77bbbb58c826b3542fa1b4bbff35ae4e707c39ee551530" exitCode=0 Dec 11 13:29:13 crc kubenswrapper[4898]: I1211 13:29:13.191294 4898 generic.go:334] "Generic (PLEG): container finished" podID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerID="a2a6904d83685e8b613df8f2a0de327d08de9e1fd7e4ee1ca2e2ac818bb7079e" exitCode=2 Dec 11 13:29:13 crc kubenswrapper[4898]: I1211 13:29:13.191307 4898 generic.go:334] "Generic (PLEG): container finished" podID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerID="7e982e672761b461554d02f60835ac21b3ae4e3c28a83b45aa4cdd86cd1143a1" exitCode=0 Dec 11 13:29:13 crc kubenswrapper[4898]: I1211 13:29:13.191327 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1","Type":"ContainerDied","Data":"ef979401f0c2ae6d9c77bbbb58c826b3542fa1b4bbff35ae4e707c39ee551530"} Dec 11 13:29:13 crc kubenswrapper[4898]: I1211 13:29:13.191359 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1","Type":"ContainerDied","Data":"a2a6904d83685e8b613df8f2a0de327d08de9e1fd7e4ee1ca2e2ac818bb7079e"} Dec 11 13:29:13 crc kubenswrapper[4898]: I1211 13:29:13.191369 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1","Type":"ContainerDied","Data":"7e982e672761b461554d02f60835ac21b3ae4e3c28a83b45aa4cdd86cd1143a1"} Dec 11 13:29:13 crc kubenswrapper[4898]: I1211 13:29:13.193199 4898 generic.go:334] "Generic (PLEG): container finished" podID="db5a5721-a011-492c-b79d-cc175721a3c4" containerID="4710cb8ea8f1d8dc82a27c9e1822b82d18c87c680794c9cf0033f6878139aeeb" exitCode=143 Dec 11 13:29:13 crc kubenswrapper[4898]: I1211 13:29:13.193346 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db5a5721-a011-492c-b79d-cc175721a3c4","Type":"ContainerDied","Data":"4710cb8ea8f1d8dc82a27c9e1822b82d18c87c680794c9cf0033f6878139aeeb"} Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.211654 4898 generic.go:334] "Generic (PLEG): container finished" podID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerID="96d65975ddadea8e8533d5cdb78a2e7c2895759ef5f22df8d1ca2d722bbbd40d" exitCode=0 Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.211916 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1","Type":"ContainerDied","Data":"96d65975ddadea8e8533d5cdb78a2e7c2895759ef5f22df8d1ca2d722bbbd40d"} Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.326703 4898 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-wjmfk" podUID="cec835c2-4160-4de3-ac12-7e808d0882b3" containerName="registry-server" probeResult="failure" output=< Dec 11 13:29:14 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 13:29:14 crc kubenswrapper[4898]: > Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.607054 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.804166 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-combined-ca-bundle\") pod \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.804891 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-scripts\") pod \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.805093 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-sg-core-conf-yaml\") pod \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.805228 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-log-httpd\") pod \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.805343 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-config-data\") pod \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.805442 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-run-httpd\") pod \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.805557 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk2r6\" (UniqueName: \"kubernetes.io/projected/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-kube-api-access-wk2r6\") pod \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\" (UID: \"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1\") " Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.805736 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" (UID: "50ddabd8-3fa4-4116-9f20-7a29bc22e7d1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.805844 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" (UID: "50ddabd8-3fa4-4116-9f20-7a29bc22e7d1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.806832 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.806918 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.812148 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-scripts" (OuterVolumeSpecName: "scripts") pod "50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" (UID: "50ddabd8-3fa4-4116-9f20-7a29bc22e7d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.818998 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-kube-api-access-wk2r6" (OuterVolumeSpecName: "kube-api-access-wk2r6") pod "50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" (UID: "50ddabd8-3fa4-4116-9f20-7a29bc22e7d1"). InnerVolumeSpecName "kube-api-access-wk2r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.848851 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" (UID: "50ddabd8-3fa4-4116-9f20-7a29bc22e7d1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.911992 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.912034 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.912049 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk2r6\" (UniqueName: \"kubernetes.io/projected/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-kube-api-access-wk2r6\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.922129 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" (UID: "50ddabd8-3fa4-4116-9f20-7a29bc22e7d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:14 crc kubenswrapper[4898]: I1211 13:29:14.959830 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-config-data" (OuterVolumeSpecName: "config-data") pod "50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" (UID: "50ddabd8-3fa4-4116-9f20-7a29bc22e7d1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.014563 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.014853 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.236716 4898 generic.go:334] "Generic (PLEG): container finished" podID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerID="c20379cd692de345371ccacef2282a32d999fa39e12d626e1bb6a5519452e5e4" exitCode=0 Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.236867 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b73e56da-3159-42d6-a1f8-d72c25c82451","Type":"ContainerDied","Data":"c20379cd692de345371ccacef2282a32d999fa39e12d626e1bb6a5519452e5e4"} Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.245533 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ddabd8-3fa4-4116-9f20-7a29bc22e7d1","Type":"ContainerDied","Data":"006520574b5be8ba9b5061a97ddb08cc80007e96279a08430f7f96b4aba9a8be"} Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.245984 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.246021 4898 scope.go:117] "RemoveContainer" containerID="ef979401f0c2ae6d9c77bbbb58c826b3542fa1b4bbff35ae4e707c39ee551530" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.283527 4898 scope.go:117] "RemoveContainer" containerID="a2a6904d83685e8b613df8f2a0de327d08de9e1fd7e4ee1ca2e2ac818bb7079e" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.290154 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.311671 4898 scope.go:117] "RemoveContainer" containerID="7e982e672761b461554d02f60835ac21b3ae4e3c28a83b45aa4cdd86cd1143a1" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.362800 4898 scope.go:117] "RemoveContainer" containerID="96d65975ddadea8e8533d5cdb78a2e7c2895759ef5f22df8d1ca2d722bbbd40d" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.370567 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.382064 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:15 crc kubenswrapper[4898]: E1211 13:29:15.384249 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerName="sg-core" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.384306 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerName="sg-core" Dec 11 13:29:15 crc kubenswrapper[4898]: E1211 13:29:15.384336 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerName="ceilometer-notification-agent" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.384343 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" 
containerName="ceilometer-notification-agent" Dec 11 13:29:15 crc kubenswrapper[4898]: E1211 13:29:15.384391 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerName="proxy-httpd" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.384400 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerName="proxy-httpd" Dec 11 13:29:15 crc kubenswrapper[4898]: E1211 13:29:15.384417 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerName="ceilometer-central-agent" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.384427 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerName="ceilometer-central-agent" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.384703 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerName="ceilometer-central-agent" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.384741 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerName="ceilometer-notification-agent" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.384763 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerName="sg-core" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.384781 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" containerName="proxy-httpd" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.387247 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.389716 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.390094 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.392300 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.540202 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-scripts\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.540346 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.540441 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b5bec9e-6060-43e8-954c-ee6153e9a841-run-httpd\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.540530 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-config-data\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " 
pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.540554 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.540655 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfdhb\" (UniqueName: \"kubernetes.io/projected/8b5bec9e-6060-43e8-954c-ee6153e9a841-kube-api-access-cfdhb\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.540713 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b5bec9e-6060-43e8-954c-ee6153e9a841-log-httpd\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.642259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfdhb\" (UniqueName: \"kubernetes.io/projected/8b5bec9e-6060-43e8-954c-ee6153e9a841-kube-api-access-cfdhb\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.642345 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b5bec9e-6060-43e8-954c-ee6153e9a841-log-httpd\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.642379 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-scripts\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.642498 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.642552 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b5bec9e-6060-43e8-954c-ee6153e9a841-run-httpd\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.642615 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-config-data\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.642638 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.643303 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b5bec9e-6060-43e8-954c-ee6153e9a841-log-httpd\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " 
pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.643786 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b5bec9e-6060-43e8-954c-ee6153e9a841-run-httpd\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.649180 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-config-data\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.649398 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.649403 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-scripts\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.649780 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.664280 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfdhb\" (UniqueName: 
\"kubernetes.io/projected/8b5bec9e-6060-43e8-954c-ee6153e9a841-kube-api-access-cfdhb\") pod \"ceilometer-0\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " pod="openstack/ceilometer-0" Dec 11 13:29:15 crc kubenswrapper[4898]: I1211 13:29:15.714884 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.213849 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.262989 4898 generic.go:334] "Generic (PLEG): container finished" podID="db5a5721-a011-492c-b79d-cc175721a3c4" containerID="183e17e589fd2c89941b4b6ee9a299c6676c61402b03bbb7f580231da70d85df" exitCode=0 Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.263055 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db5a5721-a011-492c-b79d-cc175721a3c4","Type":"ContainerDied","Data":"183e17e589fd2c89941b4b6ee9a299c6676c61402b03bbb7f580231da70d85df"} Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.264296 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b5bec9e-6060-43e8-954c-ee6153e9a841","Type":"ContainerStarted","Data":"de5a3b49ff8ee96bd283690d339f25f8998190192993318f124d24c990388b50"} Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.479375 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.527540 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.554302 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.667704 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8lm5\" (UniqueName: \"kubernetes.io/projected/db5a5721-a011-492c-b79d-cc175721a3c4-kube-api-access-k8lm5\") pod \"db5a5721-a011-492c-b79d-cc175721a3c4\" (UID: \"db5a5721-a011-492c-b79d-cc175721a3c4\") " Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.667884 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db5a5721-a011-492c-b79d-cc175721a3c4-logs\") pod \"db5a5721-a011-492c-b79d-cc175721a3c4\" (UID: \"db5a5721-a011-492c-b79d-cc175721a3c4\") " Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.667939 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5a5721-a011-492c-b79d-cc175721a3c4-combined-ca-bundle\") pod \"db5a5721-a011-492c-b79d-cc175721a3c4\" (UID: \"db5a5721-a011-492c-b79d-cc175721a3c4\") " Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.667984 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db5a5721-a011-492c-b79d-cc175721a3c4-config-data\") pod \"db5a5721-a011-492c-b79d-cc175721a3c4\" (UID: \"db5a5721-a011-492c-b79d-cc175721a3c4\") " Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.668298 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/db5a5721-a011-492c-b79d-cc175721a3c4-logs" (OuterVolumeSpecName: "logs") pod "db5a5721-a011-492c-b79d-cc175721a3c4" (UID: "db5a5721-a011-492c-b79d-cc175721a3c4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.668899 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db5a5721-a011-492c-b79d-cc175721a3c4-logs\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.676779 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db5a5721-a011-492c-b79d-cc175721a3c4-kube-api-access-k8lm5" (OuterVolumeSpecName: "kube-api-access-k8lm5") pod "db5a5721-a011-492c-b79d-cc175721a3c4" (UID: "db5a5721-a011-492c-b79d-cc175721a3c4"). InnerVolumeSpecName "kube-api-access-k8lm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.714529 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db5a5721-a011-492c-b79d-cc175721a3c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db5a5721-a011-492c-b79d-cc175721a3c4" (UID: "db5a5721-a011-492c-b79d-cc175721a3c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.724922 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db5a5721-a011-492c-b79d-cc175721a3c4-config-data" (OuterVolumeSpecName: "config-data") pod "db5a5721-a011-492c-b79d-cc175721a3c4" (UID: "db5a5721-a011-492c-b79d-cc175721a3c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.770647 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5a5721-a011-492c-b79d-cc175721a3c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.771071 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db5a5721-a011-492c-b79d-cc175721a3c4-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.771087 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8lm5\" (UniqueName: \"kubernetes.io/projected/db5a5721-a011-492c-b79d-cc175721a3c4-kube-api-access-k8lm5\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:16 crc kubenswrapper[4898]: I1211 13:29:16.801271 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50ddabd8-3fa4-4116-9f20-7a29bc22e7d1" path="/var/lib/kubelet/pods/50ddabd8-3fa4-4116-9f20-7a29bc22e7d1/volumes" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.278835 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db5a5721-a011-492c-b79d-cc175721a3c4","Type":"ContainerDied","Data":"efd9c241fd0db5a98a438a5a7410d7cf66ced0c79dccf04526786d8c5b8e7d18"} Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.278902 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.279191 4898 scope.go:117] "RemoveContainer" containerID="183e17e589fd2c89941b4b6ee9a299c6676c61402b03bbb7f580231da70d85df" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.281307 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b5bec9e-6060-43e8-954c-ee6153e9a841","Type":"ContainerStarted","Data":"0ff71a84e971432d65e47f89bc14dab420c94ec9d0695ce6e52bb57f24dd1027"} Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.299031 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.314484 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.335510 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.343543 4898 scope.go:117] "RemoveContainer" containerID="4710cb8ea8f1d8dc82a27c9e1822b82d18c87c680794c9cf0033f6878139aeeb" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.360517 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 11 13:29:17 crc kubenswrapper[4898]: E1211 13:29:17.361141 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db5a5721-a011-492c-b79d-cc175721a3c4" containerName="nova-api-log" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.361167 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="db5a5721-a011-492c-b79d-cc175721a3c4" containerName="nova-api-log" Dec 11 13:29:17 crc kubenswrapper[4898]: E1211 13:29:17.361217 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db5a5721-a011-492c-b79d-cc175721a3c4" containerName="nova-api-api" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.361227 4898 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="db5a5721-a011-492c-b79d-cc175721a3c4" containerName="nova-api-api" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.361525 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="db5a5721-a011-492c-b79d-cc175721a3c4" containerName="nova-api-api" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.361551 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="db5a5721-a011-492c-b79d-cc175721a3c4" containerName="nova-api-log" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.363088 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.367704 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.368077 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.368367 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.386031 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-public-tls-certs\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.386254 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-config-data\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.386432 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.387018 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7nwc\" (UniqueName: \"kubernetes.io/projected/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-kube-api-access-g7nwc\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.387111 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-logs\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.387233 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.408022 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.489913 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-public-tls-certs\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.489961 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-config-data\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.490043 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.490099 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7nwc\" (UniqueName: \"kubernetes.io/projected/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-kube-api-access-g7nwc\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.490124 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-logs\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.490140 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.493942 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-logs\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 
13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.497658 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.498182 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-public-tls-certs\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.518783 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.519142 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7nwc\" (UniqueName: \"kubernetes.io/projected/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-kube-api-access-g7nwc\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.519515 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-config-data\") pod \"nova-api-0\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.628233 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-2hj8k"] Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.635594 4898 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.641539 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.641835 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.655207 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2hj8k"] Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.701895 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.808595 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-scripts\") pod \"nova-cell1-cell-mapping-2hj8k\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.808972 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2hj8k\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.809050 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcjgp\" (UniqueName: \"kubernetes.io/projected/c4188522-fa4a-4c8e-92af-dd304dbc64f1-kube-api-access-tcjgp\") pod \"nova-cell1-cell-mapping-2hj8k\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 
13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.809974 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-config-data\") pod \"nova-cell1-cell-mapping-2hj8k\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.912945 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-config-data\") pod \"nova-cell1-cell-mapping-2hj8k\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.913090 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-scripts\") pod \"nova-cell1-cell-mapping-2hj8k\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.913120 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2hj8k\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.913168 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcjgp\" (UniqueName: \"kubernetes.io/projected/c4188522-fa4a-4c8e-92af-dd304dbc64f1-kube-api-access-tcjgp\") pod \"nova-cell1-cell-mapping-2hj8k\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 
13:29:17.921965 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-scripts\") pod \"nova-cell1-cell-mapping-2hj8k\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.922203 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-config-data\") pod \"nova-cell1-cell-mapping-2hj8k\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.922703 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2hj8k\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 13:29:17 crc kubenswrapper[4898]: I1211 13:29:17.935028 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcjgp\" (UniqueName: \"kubernetes.io/projected/c4188522-fa4a-4c8e-92af-dd304dbc64f1-kube-api-access-tcjgp\") pod \"nova-cell1-cell-mapping-2hj8k\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 13:29:18 crc kubenswrapper[4898]: I1211 13:29:18.091229 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 13:29:18 crc kubenswrapper[4898]: I1211 13:29:18.231562 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:29:18 crc kubenswrapper[4898]: I1211 13:29:18.301182 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b5bec9e-6060-43e8-954c-ee6153e9a841","Type":"ContainerStarted","Data":"e5b8edb8ea2ed08143ee09684135eca75584ff72dc887d2c5fb4a80e271996bd"} Dec 11 13:29:18 crc kubenswrapper[4898]: I1211 13:29:18.303261 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51554a75-4d15-45ef-9b09-3d4d08a8bc0a","Type":"ContainerStarted","Data":"e1d048188c6b7ebebf1e4a2d07f0a88771ea25350dd49c75943731e762cdab88"} Dec 11 13:29:18 crc kubenswrapper[4898]: W1211 13:29:18.639529 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4188522_fa4a_4c8e_92af_dd304dbc64f1.slice/crio-6fb4847e9f53919784e832d24c9ac4a0dbc0ffb28576e85c1afbbf3761dee9fc WatchSource:0}: Error finding container 6fb4847e9f53919784e832d24c9ac4a0dbc0ffb28576e85c1afbbf3761dee9fc: Status 404 returned error can't find the container with id 6fb4847e9f53919784e832d24c9ac4a0dbc0ffb28576e85c1afbbf3761dee9fc Dec 11 13:29:18 crc kubenswrapper[4898]: I1211 13:29:18.639731 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2hj8k"] Dec 11 13:29:18 crc kubenswrapper[4898]: I1211 13:29:18.799525 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db5a5721-a011-492c-b79d-cc175721a3c4" path="/var/lib/kubelet/pods/db5a5721-a011-492c-b79d-cc175721a3c4/volumes" Dec 11 13:29:19 crc kubenswrapper[4898]: I1211 13:29:19.316185 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"51554a75-4d15-45ef-9b09-3d4d08a8bc0a","Type":"ContainerStarted","Data":"59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15"} Dec 11 13:29:19 crc kubenswrapper[4898]: I1211 13:29:19.316230 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51554a75-4d15-45ef-9b09-3d4d08a8bc0a","Type":"ContainerStarted","Data":"ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0"} Dec 11 13:29:19 crc kubenswrapper[4898]: I1211 13:29:19.318910 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2hj8k" event={"ID":"c4188522-fa4a-4c8e-92af-dd304dbc64f1","Type":"ContainerStarted","Data":"dd3e7074c59bad247315decb5917b222623bd6e03dd0c4043626d5da8c535707"} Dec 11 13:29:19 crc kubenswrapper[4898]: I1211 13:29:19.318959 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2hj8k" event={"ID":"c4188522-fa4a-4c8e-92af-dd304dbc64f1","Type":"ContainerStarted","Data":"6fb4847e9f53919784e832d24c9ac4a0dbc0ffb28576e85c1afbbf3761dee9fc"} Dec 11 13:29:19 crc kubenswrapper[4898]: I1211 13:29:19.322333 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b5bec9e-6060-43e8-954c-ee6153e9a841","Type":"ContainerStarted","Data":"12351c1f1f32a9581dfa326b208d96bd48915b87a3c8fefd0ebc8992142c55c5"} Dec 11 13:29:19 crc kubenswrapper[4898]: I1211 13:29:19.342964 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.342939172 podStartE2EDuration="2.342939172s" podCreationTimestamp="2025-12-11 13:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:29:19.332977232 +0000 UTC m=+1516.905303679" watchObservedRunningTime="2025-12-11 13:29:19.342939172 +0000 UTC m=+1516.915265609" Dec 11 13:29:19 crc kubenswrapper[4898]: I1211 13:29:19.371888 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-2hj8k" podStartSLOduration=2.371863524 podStartE2EDuration="2.371863524s" podCreationTimestamp="2025-12-11 13:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:29:19.352415828 +0000 UTC m=+1516.924742265" watchObservedRunningTime="2025-12-11 13:29:19.371863524 +0000 UTC m=+1516.944189961" Dec 11 13:29:19 crc kubenswrapper[4898]: I1211 13:29:19.722613 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:29:19 crc kubenswrapper[4898]: I1211 13:29:19.810347 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-g8csh"] Dec 11 13:29:19 crc kubenswrapper[4898]: I1211 13:29:19.810658 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" podUID="15ab34e2-564c-479b-a8af-c428abb92e17" containerName="dnsmasq-dns" containerID="cri-o://a3c664e55bf0186f9da9cc6fb55b1679a3f6b85c6614e1aacc0f6f280ef624d8" gracePeriod=10 Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.352287 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b5bec9e-6060-43e8-954c-ee6153e9a841","Type":"ContainerStarted","Data":"ae5f57e85ad7814ab6cd8ae2cf85c905997f161db6a3cf68751505a0d0f0865c"} Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.354694 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.358381 4898 generic.go:334] "Generic (PLEG): container finished" podID="15ab34e2-564c-479b-a8af-c428abb92e17" containerID="a3c664e55bf0186f9da9cc6fb55b1679a3f6b85c6614e1aacc0f6f280ef624d8" exitCode=0 Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.359430 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" event={"ID":"15ab34e2-564c-479b-a8af-c428abb92e17","Type":"ContainerDied","Data":"a3c664e55bf0186f9da9cc6fb55b1679a3f6b85c6614e1aacc0f6f280ef624d8"} Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.359475 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" event={"ID":"15ab34e2-564c-479b-a8af-c428abb92e17","Type":"ContainerDied","Data":"b0649ae698e71d91b30746b58d42bd209699f98c8a5fa50d81de43885811fd5a"} Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.359489 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0649ae698e71d91b30746b58d42bd209699f98c8a5fa50d81de43885811fd5a" Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.391077 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.938133992 podStartE2EDuration="5.391058897s" podCreationTimestamp="2025-12-11 13:29:15 +0000 UTC" firstStartedPulling="2025-12-11 13:29:16.214656942 +0000 UTC m=+1513.786983379" lastFinishedPulling="2025-12-11 13:29:19.667581847 +0000 UTC m=+1517.239908284" observedRunningTime="2025-12-11 13:29:20.381110938 +0000 UTC m=+1517.953437455" watchObservedRunningTime="2025-12-11 13:29:20.391058897 +0000 UTC m=+1517.963385334" Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.418286 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.485160 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-config\") pod \"15ab34e2-564c-479b-a8af-c428abb92e17\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.485349 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-ovsdbserver-sb\") pod \"15ab34e2-564c-479b-a8af-c428abb92e17\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.485499 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-dns-swift-storage-0\") pod \"15ab34e2-564c-479b-a8af-c428abb92e17\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.485599 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-ovsdbserver-nb\") pod \"15ab34e2-564c-479b-a8af-c428abb92e17\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.485884 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg7vs\" (UniqueName: \"kubernetes.io/projected/15ab34e2-564c-479b-a8af-c428abb92e17-kube-api-access-dg7vs\") pod \"15ab34e2-564c-479b-a8af-c428abb92e17\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.485939 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-dns-svc\") pod \"15ab34e2-564c-479b-a8af-c428abb92e17\" (UID: \"15ab34e2-564c-479b-a8af-c428abb92e17\") " Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.504694 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ab34e2-564c-479b-a8af-c428abb92e17-kube-api-access-dg7vs" (OuterVolumeSpecName: "kube-api-access-dg7vs") pod "15ab34e2-564c-479b-a8af-c428abb92e17" (UID: "15ab34e2-564c-479b-a8af-c428abb92e17"). InnerVolumeSpecName "kube-api-access-dg7vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.587402 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "15ab34e2-564c-479b-a8af-c428abb92e17" (UID: "15ab34e2-564c-479b-a8af-c428abb92e17"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.589300 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.589337 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg7vs\" (UniqueName: \"kubernetes.io/projected/15ab34e2-564c-479b-a8af-c428abb92e17-kube-api-access-dg7vs\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.593904 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "15ab34e2-564c-479b-a8af-c428abb92e17" (UID: "15ab34e2-564c-479b-a8af-c428abb92e17"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.596695 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-config" (OuterVolumeSpecName: "config") pod "15ab34e2-564c-479b-a8af-c428abb92e17" (UID: "15ab34e2-564c-479b-a8af-c428abb92e17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.628053 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "15ab34e2-564c-479b-a8af-c428abb92e17" (UID: "15ab34e2-564c-479b-a8af-c428abb92e17"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.636748 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "15ab34e2-564c-479b-a8af-c428abb92e17" (UID: "15ab34e2-564c-479b-a8af-c428abb92e17"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.691754 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.691782 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.691793 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:20 crc kubenswrapper[4898]: I1211 13:29:20.691801 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15ab34e2-564c-479b-a8af-c428abb92e17-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:21 crc kubenswrapper[4898]: I1211 13:29:21.368093 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-g8csh" Dec 11 13:29:21 crc kubenswrapper[4898]: I1211 13:29:21.403987 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-g8csh"] Dec 11 13:29:21 crc kubenswrapper[4898]: I1211 13:29:21.422907 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-g8csh"] Dec 11 13:29:22 crc kubenswrapper[4898]: I1211 13:29:22.798705 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15ab34e2-564c-479b-a8af-c428abb92e17" path="/var/lib/kubelet/pods/15ab34e2-564c-479b-a8af-c428abb92e17/volumes" Dec 11 13:29:23 crc kubenswrapper[4898]: I1211 13:29:23.341412 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:29:23 crc kubenswrapper[4898]: I1211 13:29:23.404397 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:29:23 crc kubenswrapper[4898]: I1211 13:29:23.852835 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wjmfk"] Dec 11 13:29:24 crc kubenswrapper[4898]: I1211 13:29:24.407957 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wjmfk" podUID="cec835c2-4160-4de3-ac12-7e808d0882b3" containerName="registry-server" containerID="cri-o://9292cebc1b48aeecbc0ce9044ffe5d53bb4221f09cd1347d0fbd64f4940c6b97" gracePeriod=2 Dec 11 13:29:24 crc kubenswrapper[4898]: I1211 13:29:24.975184 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.012583 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec835c2-4160-4de3-ac12-7e808d0882b3-utilities\") pod \"cec835c2-4160-4de3-ac12-7e808d0882b3\" (UID: \"cec835c2-4160-4de3-ac12-7e808d0882b3\") " Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.012832 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec835c2-4160-4de3-ac12-7e808d0882b3-catalog-content\") pod \"cec835c2-4160-4de3-ac12-7e808d0882b3\" (UID: \"cec835c2-4160-4de3-ac12-7e808d0882b3\") " Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.016591 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec835c2-4160-4de3-ac12-7e808d0882b3-utilities" (OuterVolumeSpecName: "utilities") pod "cec835c2-4160-4de3-ac12-7e808d0882b3" (UID: "cec835c2-4160-4de3-ac12-7e808d0882b3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.115028 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g86t9\" (UniqueName: \"kubernetes.io/projected/cec835c2-4160-4de3-ac12-7e808d0882b3-kube-api-access-g86t9\") pod \"cec835c2-4160-4de3-ac12-7e808d0882b3\" (UID: \"cec835c2-4160-4de3-ac12-7e808d0882b3\") " Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.115880 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec835c2-4160-4de3-ac12-7e808d0882b3-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.122231 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cec835c2-4160-4de3-ac12-7e808d0882b3-kube-api-access-g86t9" (OuterVolumeSpecName: "kube-api-access-g86t9") pod "cec835c2-4160-4de3-ac12-7e808d0882b3" (UID: "cec835c2-4160-4de3-ac12-7e808d0882b3"). InnerVolumeSpecName "kube-api-access-g86t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.138059 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec835c2-4160-4de3-ac12-7e808d0882b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cec835c2-4160-4de3-ac12-7e808d0882b3" (UID: "cec835c2-4160-4de3-ac12-7e808d0882b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.217699 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec835c2-4160-4de3-ac12-7e808d0882b3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.217742 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g86t9\" (UniqueName: \"kubernetes.io/projected/cec835c2-4160-4de3-ac12-7e808d0882b3-kube-api-access-g86t9\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.420992 4898 generic.go:334] "Generic (PLEG): container finished" podID="c4188522-fa4a-4c8e-92af-dd304dbc64f1" containerID="dd3e7074c59bad247315decb5917b222623bd6e03dd0c4043626d5da8c535707" exitCode=0 Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.421122 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2hj8k" event={"ID":"c4188522-fa4a-4c8e-92af-dd304dbc64f1","Type":"ContainerDied","Data":"dd3e7074c59bad247315decb5917b222623bd6e03dd0c4043626d5da8c535707"} Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.425239 4898 generic.go:334] "Generic (PLEG): container finished" podID="cec835c2-4160-4de3-ac12-7e808d0882b3" containerID="9292cebc1b48aeecbc0ce9044ffe5d53bb4221f09cd1347d0fbd64f4940c6b97" exitCode=0 Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.425291 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjmfk" event={"ID":"cec835c2-4160-4de3-ac12-7e808d0882b3","Type":"ContainerDied","Data":"9292cebc1b48aeecbc0ce9044ffe5d53bb4221f09cd1347d0fbd64f4940c6b97"} Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.425332 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wjmfk" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.425386 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjmfk" event={"ID":"cec835c2-4160-4de3-ac12-7e808d0882b3","Type":"ContainerDied","Data":"fdbf33e56068f5e21b63a35214eb3d35a7049edd994047a8996fbd7549f8c1f9"} Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.425419 4898 scope.go:117] "RemoveContainer" containerID="9292cebc1b48aeecbc0ce9044ffe5d53bb4221f09cd1347d0fbd64f4940c6b97" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.455786 4898 scope.go:117] "RemoveContainer" containerID="d4e4c4012d995f9a3a0d09a553e88e460cd092b296f6614c95d691c579e53039" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.483094 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wjmfk"] Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.493449 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wjmfk"] Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.509550 4898 scope.go:117] "RemoveContainer" containerID="2596a74cf684f98fac072af76d18988d5cbfb02f9add9d93ea9ed2a758b72ca2" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.550703 4898 scope.go:117] "RemoveContainer" containerID="9292cebc1b48aeecbc0ce9044ffe5d53bb4221f09cd1347d0fbd64f4940c6b97" Dec 11 13:29:25 crc kubenswrapper[4898]: E1211 13:29:25.551491 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9292cebc1b48aeecbc0ce9044ffe5d53bb4221f09cd1347d0fbd64f4940c6b97\": container with ID starting with 9292cebc1b48aeecbc0ce9044ffe5d53bb4221f09cd1347d0fbd64f4940c6b97 not found: ID does not exist" containerID="9292cebc1b48aeecbc0ce9044ffe5d53bb4221f09cd1347d0fbd64f4940c6b97" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.551540 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9292cebc1b48aeecbc0ce9044ffe5d53bb4221f09cd1347d0fbd64f4940c6b97"} err="failed to get container status \"9292cebc1b48aeecbc0ce9044ffe5d53bb4221f09cd1347d0fbd64f4940c6b97\": rpc error: code = NotFound desc = could not find container \"9292cebc1b48aeecbc0ce9044ffe5d53bb4221f09cd1347d0fbd64f4940c6b97\": container with ID starting with 9292cebc1b48aeecbc0ce9044ffe5d53bb4221f09cd1347d0fbd64f4940c6b97 not found: ID does not exist" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.551569 4898 scope.go:117] "RemoveContainer" containerID="d4e4c4012d995f9a3a0d09a553e88e460cd092b296f6614c95d691c579e53039" Dec 11 13:29:25 crc kubenswrapper[4898]: E1211 13:29:25.552030 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4e4c4012d995f9a3a0d09a553e88e460cd092b296f6614c95d691c579e53039\": container with ID starting with d4e4c4012d995f9a3a0d09a553e88e460cd092b296f6614c95d691c579e53039 not found: ID does not exist" containerID="d4e4c4012d995f9a3a0d09a553e88e460cd092b296f6614c95d691c579e53039" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.552073 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4e4c4012d995f9a3a0d09a553e88e460cd092b296f6614c95d691c579e53039"} err="failed to get container status \"d4e4c4012d995f9a3a0d09a553e88e460cd092b296f6614c95d691c579e53039\": rpc error: code = NotFound desc = could not find container \"d4e4c4012d995f9a3a0d09a553e88e460cd092b296f6614c95d691c579e53039\": container with ID starting with d4e4c4012d995f9a3a0d09a553e88e460cd092b296f6614c95d691c579e53039 not found: ID does not exist" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.552104 4898 scope.go:117] "RemoveContainer" containerID="2596a74cf684f98fac072af76d18988d5cbfb02f9add9d93ea9ed2a758b72ca2" Dec 11 13:29:25 crc kubenswrapper[4898]: E1211 
13:29:25.552925 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2596a74cf684f98fac072af76d18988d5cbfb02f9add9d93ea9ed2a758b72ca2\": container with ID starting with 2596a74cf684f98fac072af76d18988d5cbfb02f9add9d93ea9ed2a758b72ca2 not found: ID does not exist" containerID="2596a74cf684f98fac072af76d18988d5cbfb02f9add9d93ea9ed2a758b72ca2" Dec 11 13:29:25 crc kubenswrapper[4898]: I1211 13:29:25.552978 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2596a74cf684f98fac072af76d18988d5cbfb02f9add9d93ea9ed2a758b72ca2"} err="failed to get container status \"2596a74cf684f98fac072af76d18988d5cbfb02f9add9d93ea9ed2a758b72ca2\": rpc error: code = NotFound desc = could not find container \"2596a74cf684f98fac072af76d18988d5cbfb02f9add9d93ea9ed2a758b72ca2\": container with ID starting with 2596a74cf684f98fac072af76d18988d5cbfb02f9add9d93ea9ed2a758b72ca2 not found: ID does not exist" Dec 11 13:29:26 crc kubenswrapper[4898]: I1211 13:29:26.793387 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cec835c2-4160-4de3-ac12-7e808d0882b3" path="/var/lib/kubelet/pods/cec835c2-4160-4de3-ac12-7e808d0882b3/volumes" Dec 11 13:29:26 crc kubenswrapper[4898]: I1211 13:29:26.910397 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.058086 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-scripts\") pod \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.058436 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-config-data\") pod \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.058566 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcjgp\" (UniqueName: \"kubernetes.io/projected/c4188522-fa4a-4c8e-92af-dd304dbc64f1-kube-api-access-tcjgp\") pod \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.058700 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-combined-ca-bundle\") pod \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\" (UID: \"c4188522-fa4a-4c8e-92af-dd304dbc64f1\") " Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.064254 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-scripts" (OuterVolumeSpecName: "scripts") pod "c4188522-fa4a-4c8e-92af-dd304dbc64f1" (UID: "c4188522-fa4a-4c8e-92af-dd304dbc64f1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.065341 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4188522-fa4a-4c8e-92af-dd304dbc64f1-kube-api-access-tcjgp" (OuterVolumeSpecName: "kube-api-access-tcjgp") pod "c4188522-fa4a-4c8e-92af-dd304dbc64f1" (UID: "c4188522-fa4a-4c8e-92af-dd304dbc64f1"). InnerVolumeSpecName "kube-api-access-tcjgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.095444 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-config-data" (OuterVolumeSpecName: "config-data") pod "c4188522-fa4a-4c8e-92af-dd304dbc64f1" (UID: "c4188522-fa4a-4c8e-92af-dd304dbc64f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.096423 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4188522-fa4a-4c8e-92af-dd304dbc64f1" (UID: "c4188522-fa4a-4c8e-92af-dd304dbc64f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.161667 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.161872 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.161961 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcjgp\" (UniqueName: \"kubernetes.io/projected/c4188522-fa4a-4c8e-92af-dd304dbc64f1-kube-api-access-tcjgp\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.162072 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4188522-fa4a-4c8e-92af-dd304dbc64f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.462785 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2hj8k" event={"ID":"c4188522-fa4a-4c8e-92af-dd304dbc64f1","Type":"ContainerDied","Data":"6fb4847e9f53919784e832d24c9ac4a0dbc0ffb28576e85c1afbbf3761dee9fc"} Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.462832 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fb4847e9f53919784e832d24c9ac4a0dbc0ffb28576e85c1afbbf3761dee9fc" Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.462911 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2hj8k" Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.638202 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.638521 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="51554a75-4d15-45ef-9b09-3d4d08a8bc0a" containerName="nova-api-log" containerID="cri-o://ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0" gracePeriod=30 Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.639013 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="51554a75-4d15-45ef-9b09-3d4d08a8bc0a" containerName="nova-api-api" containerID="cri-o://59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15" gracePeriod=30 Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.657106 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.657381 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4337e072-ac49-4912-946d-e52de7c33a1b" containerName="nova-scheduler-scheduler" containerID="cri-o://a1dbf6c292cd096709c9173ab16e98d34d6b4899b2eae82c2ee00bd664f557d8" gracePeriod=30 Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.688811 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.689658 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c9d50b6b-d1c5-4072-964e-475b4bfb685d" containerName="nova-metadata-log" containerID="cri-o://2f061f8cda1338f451f3df1a1607c6f6db4157dc25c409a20ffdd023497e8b3b" gracePeriod=30 Dec 11 13:29:27 crc kubenswrapper[4898]: I1211 13:29:27.689735 4898 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c9d50b6b-d1c5-4072-964e-475b4bfb685d" containerName="nova-metadata-metadata" containerID="cri-o://cefc528e45f4b84ad123b7112e1b8cdbc6fffed72b0905ba66016708c0b55e09" gracePeriod=30 Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.454272 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.493062 4898 generic.go:334] "Generic (PLEG): container finished" podID="c9d50b6b-d1c5-4072-964e-475b4bfb685d" containerID="2f061f8cda1338f451f3df1a1607c6f6db4157dc25c409a20ffdd023497e8b3b" exitCode=143 Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.493132 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9d50b6b-d1c5-4072-964e-475b4bfb685d","Type":"ContainerDied","Data":"2f061f8cda1338f451f3df1a1607c6f6db4157dc25c409a20ffdd023497e8b3b"} Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.495988 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-config-data\") pod \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.496069 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-public-tls-certs\") pod \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.496246 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-logs\") pod 
\"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.496300 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7nwc\" (UniqueName: \"kubernetes.io/projected/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-kube-api-access-g7nwc\") pod \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.496498 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-internal-tls-certs\") pod \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.496522 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-combined-ca-bundle\") pod \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\" (UID: \"51554a75-4d15-45ef-9b09-3d4d08a8bc0a\") " Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.496609 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-logs" (OuterVolumeSpecName: "logs") pod "51554a75-4d15-45ef-9b09-3d4d08a8bc0a" (UID: "51554a75-4d15-45ef-9b09-3d4d08a8bc0a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.497220 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-logs\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.501899 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-kube-api-access-g7nwc" (OuterVolumeSpecName: "kube-api-access-g7nwc") pod "51554a75-4d15-45ef-9b09-3d4d08a8bc0a" (UID: "51554a75-4d15-45ef-9b09-3d4d08a8bc0a"). InnerVolumeSpecName "kube-api-access-g7nwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.503410 4898 generic.go:334] "Generic (PLEG): container finished" podID="51554a75-4d15-45ef-9b09-3d4d08a8bc0a" containerID="59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15" exitCode=0 Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.503438 4898 generic.go:334] "Generic (PLEG): container finished" podID="51554a75-4d15-45ef-9b09-3d4d08a8bc0a" containerID="ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0" exitCode=143 Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.503475 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51554a75-4d15-45ef-9b09-3d4d08a8bc0a","Type":"ContainerDied","Data":"59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15"} Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.503504 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51554a75-4d15-45ef-9b09-3d4d08a8bc0a","Type":"ContainerDied","Data":"ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0"} Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.503513 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"51554a75-4d15-45ef-9b09-3d4d08a8bc0a","Type":"ContainerDied","Data":"e1d048188c6b7ebebf1e4a2d07f0a88771ea25350dd49c75943731e762cdab88"} Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.503528 4898 scope.go:117] "RemoveContainer" containerID="59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.503572 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.539122 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-config-data" (OuterVolumeSpecName: "config-data") pod "51554a75-4d15-45ef-9b09-3d4d08a8bc0a" (UID: "51554a75-4d15-45ef-9b09-3d4d08a8bc0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.542808 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51554a75-4d15-45ef-9b09-3d4d08a8bc0a" (UID: "51554a75-4d15-45ef-9b09-3d4d08a8bc0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.588504 4898 scope.go:117] "RemoveContainer" containerID="ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.589871 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "51554a75-4d15-45ef-9b09-3d4d08a8bc0a" (UID: "51554a75-4d15-45ef-9b09-3d4d08a8bc0a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.592394 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "51554a75-4d15-45ef-9b09-3d4d08a8bc0a" (UID: "51554a75-4d15-45ef-9b09-3d4d08a8bc0a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.599008 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.599054 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.599067 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.599078 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.599091 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7nwc\" (UniqueName: \"kubernetes.io/projected/51554a75-4d15-45ef-9b09-3d4d08a8bc0a-kube-api-access-g7nwc\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.620744 4898 scope.go:117] "RemoveContainer" 
containerID="59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15" Dec 11 13:29:28 crc kubenswrapper[4898]: E1211 13:29:28.621751 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15\": container with ID starting with 59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15 not found: ID does not exist" containerID="59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.621781 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15"} err="failed to get container status \"59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15\": rpc error: code = NotFound desc = could not find container \"59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15\": container with ID starting with 59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15 not found: ID does not exist" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.621827 4898 scope.go:117] "RemoveContainer" containerID="ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0" Dec 11 13:29:28 crc kubenswrapper[4898]: E1211 13:29:28.622123 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0\": container with ID starting with ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0 not found: ID does not exist" containerID="ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.622155 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0"} err="failed to get container status \"ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0\": rpc error: code = NotFound desc = could not find container \"ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0\": container with ID starting with ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0 not found: ID does not exist" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.622183 4898 scope.go:117] "RemoveContainer" containerID="59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.622438 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15"} err="failed to get container status \"59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15\": rpc error: code = NotFound desc = could not find container \"59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15\": container with ID starting with 59f970d82ca624167203babda7e2de83da79e47e0a3eee36b8cc957db3fedd15 not found: ID does not exist" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.622483 4898 scope.go:117] "RemoveContainer" containerID="ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.622727 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0"} err="failed to get container status \"ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0\": rpc error: code = NotFound desc = could not find container \"ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0\": container with ID starting with ef4205358fec75ed00826564e5c3263149146455434cd62c9a57a5cca4260ed0 not found: ID does not 
exist" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.830186 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.841046 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.860593 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 11 13:29:28 crc kubenswrapper[4898]: E1211 13:29:28.861254 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec835c2-4160-4de3-ac12-7e808d0882b3" containerName="extract-utilities" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.861281 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec835c2-4160-4de3-ac12-7e808d0882b3" containerName="extract-utilities" Dec 11 13:29:28 crc kubenswrapper[4898]: E1211 13:29:28.861293 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec835c2-4160-4de3-ac12-7e808d0882b3" containerName="registry-server" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.861302 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec835c2-4160-4de3-ac12-7e808d0882b3" containerName="registry-server" Dec 11 13:29:28 crc kubenswrapper[4898]: E1211 13:29:28.861323 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51554a75-4d15-45ef-9b09-3d4d08a8bc0a" containerName="nova-api-api" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.861331 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="51554a75-4d15-45ef-9b09-3d4d08a8bc0a" containerName="nova-api-api" Dec 11 13:29:28 crc kubenswrapper[4898]: E1211 13:29:28.861356 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51554a75-4d15-45ef-9b09-3d4d08a8bc0a" containerName="nova-api-log" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.861368 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="51554a75-4d15-45ef-9b09-3d4d08a8bc0a" 
containerName="nova-api-log" Dec 11 13:29:28 crc kubenswrapper[4898]: E1211 13:29:28.861390 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ab34e2-564c-479b-a8af-c428abb92e17" containerName="init" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.861397 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ab34e2-564c-479b-a8af-c428abb92e17" containerName="init" Dec 11 13:29:28 crc kubenswrapper[4898]: E1211 13:29:28.861428 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4188522-fa4a-4c8e-92af-dd304dbc64f1" containerName="nova-manage" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.861438 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4188522-fa4a-4c8e-92af-dd304dbc64f1" containerName="nova-manage" Dec 11 13:29:28 crc kubenswrapper[4898]: E1211 13:29:28.861481 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ab34e2-564c-479b-a8af-c428abb92e17" containerName="dnsmasq-dns" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.861490 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ab34e2-564c-479b-a8af-c428abb92e17" containerName="dnsmasq-dns" Dec 11 13:29:28 crc kubenswrapper[4898]: E1211 13:29:28.861519 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec835c2-4160-4de3-ac12-7e808d0882b3" containerName="extract-content" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.861529 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec835c2-4160-4de3-ac12-7e808d0882b3" containerName="extract-content" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.861832 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="51554a75-4d15-45ef-9b09-3d4d08a8bc0a" containerName="nova-api-api" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.861864 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="51554a75-4d15-45ef-9b09-3d4d08a8bc0a" containerName="nova-api-log" Dec 11 13:29:28 crc 
kubenswrapper[4898]: I1211 13:29:28.861883 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="15ab34e2-564c-479b-a8af-c428abb92e17" containerName="dnsmasq-dns" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.861898 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4188522-fa4a-4c8e-92af-dd304dbc64f1" containerName="nova-manage" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.861934 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec835c2-4160-4de3-ac12-7e808d0882b3" containerName="registry-server" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.863585 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.865647 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.865876 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.866357 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.874041 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.904800 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f955af-c5c3-47de-b1ef-f23b46d06a62-config-data\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.904866 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/12f955af-c5c3-47de-b1ef-f23b46d06a62-internal-tls-certs\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.905188 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12f955af-c5c3-47de-b1ef-f23b46d06a62-logs\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.905242 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bhfm\" (UniqueName: \"kubernetes.io/projected/12f955af-c5c3-47de-b1ef-f23b46d06a62-kube-api-access-6bhfm\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.905300 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12f955af-c5c3-47de-b1ef-f23b46d06a62-public-tls-certs\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:28 crc kubenswrapper[4898]: I1211 13:29:28.905371 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f955af-c5c3-47de-b1ef-f23b46d06a62-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:29 crc kubenswrapper[4898]: I1211 13:29:29.008155 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12f955af-c5c3-47de-b1ef-f23b46d06a62-logs\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" 
Dec 11 13:29:29 crc kubenswrapper[4898]: I1211 13:29:29.008219 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bhfm\" (UniqueName: \"kubernetes.io/projected/12f955af-c5c3-47de-b1ef-f23b46d06a62-kube-api-access-6bhfm\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:29 crc kubenswrapper[4898]: I1211 13:29:29.008308 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12f955af-c5c3-47de-b1ef-f23b46d06a62-public-tls-certs\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:29 crc kubenswrapper[4898]: I1211 13:29:29.008374 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f955af-c5c3-47de-b1ef-f23b46d06a62-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:29 crc kubenswrapper[4898]: I1211 13:29:29.008530 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f955af-c5c3-47de-b1ef-f23b46d06a62-config-data\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:29 crc kubenswrapper[4898]: I1211 13:29:29.008566 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12f955af-c5c3-47de-b1ef-f23b46d06a62-internal-tls-certs\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:29 crc kubenswrapper[4898]: I1211 13:29:29.009001 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/12f955af-c5c3-47de-b1ef-f23b46d06a62-logs\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:29 crc kubenswrapper[4898]: I1211 13:29:29.011822 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12f955af-c5c3-47de-b1ef-f23b46d06a62-internal-tls-certs\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:29 crc kubenswrapper[4898]: I1211 13:29:29.012156 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12f955af-c5c3-47de-b1ef-f23b46d06a62-public-tls-certs\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:29 crc kubenswrapper[4898]: I1211 13:29:29.012907 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f955af-c5c3-47de-b1ef-f23b46d06a62-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:29 crc kubenswrapper[4898]: I1211 13:29:29.013541 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f955af-c5c3-47de-b1ef-f23b46d06a62-config-data\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:29 crc kubenswrapper[4898]: I1211 13:29:29.031845 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bhfm\" (UniqueName: \"kubernetes.io/projected/12f955af-c5c3-47de-b1ef-f23b46d06a62-kube-api-access-6bhfm\") pod \"nova-api-0\" (UID: \"12f955af-c5c3-47de-b1ef-f23b46d06a62\") " pod="openstack/nova-api-0" Dec 11 13:29:29 crc kubenswrapper[4898]: I1211 13:29:29.200572 4898 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 11 13:29:29 crc kubenswrapper[4898]: W1211 13:29:29.684080 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12f955af_c5c3_47de_b1ef_f23b46d06a62.slice/crio-c12fa6187b7cca4a8b31c3e6d346b656853c08e59589855ed5033cddfceaf1b3 WatchSource:0}: Error finding container c12fa6187b7cca4a8b31c3e6d346b656853c08e59589855ed5033cddfceaf1b3: Status 404 returned error can't find the container with id c12fa6187b7cca4a8b31c3e6d346b656853c08e59589855ed5033cddfceaf1b3 Dec 11 13:29:29 crc kubenswrapper[4898]: I1211 13:29:29.685887 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 13:29:30 crc kubenswrapper[4898]: I1211 13:29:30.391539 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c3709b73-c7ab-4ad7-8d8d-2dc55ea4e46f" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.238:3000/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 13:29:30 crc kubenswrapper[4898]: I1211 13:29:30.536185 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12f955af-c5c3-47de-b1ef-f23b46d06a62","Type":"ContainerStarted","Data":"f949a130c7f732f53e17aa2336b114b7a9ac9954cde9ddfc51c34714d51b1ce7"} Dec 11 13:29:30 crc kubenswrapper[4898]: I1211 13:29:30.536242 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12f955af-c5c3-47de-b1ef-f23b46d06a62","Type":"ContainerStarted","Data":"9df3a4b28676c72448caeea9d056223d77a5daa35ae05aacfe620495338843f4"} Dec 11 13:29:30 crc kubenswrapper[4898]: I1211 13:29:30.536260 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"12f955af-c5c3-47de-b1ef-f23b46d06a62","Type":"ContainerStarted","Data":"c12fa6187b7cca4a8b31c3e6d346b656853c08e59589855ed5033cddfceaf1b3"} Dec 11 13:29:30 crc kubenswrapper[4898]: I1211 13:29:30.570407 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.570373347 podStartE2EDuration="2.570373347s" podCreationTimestamp="2025-12-11 13:29:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:29:30.55605008 +0000 UTC m=+1528.128376567" watchObservedRunningTime="2025-12-11 13:29:30.570373347 +0000 UTC m=+1528.142699814" Dec 11 13:29:30 crc kubenswrapper[4898]: I1211 13:29:30.804447 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51554a75-4d15-45ef-9b09-3d4d08a8bc0a" path="/var/lib/kubelet/pods/51554a75-4d15-45ef-9b09-3d4d08a8bc0a/volumes" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.483997 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.559275 4898 generic.go:334] "Generic (PLEG): container finished" podID="4337e072-ac49-4912-946d-e52de7c33a1b" containerID="a1dbf6c292cd096709c9173ab16e98d34d6b4899b2eae82c2ee00bd664f557d8" exitCode=0 Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.559350 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4337e072-ac49-4912-946d-e52de7c33a1b","Type":"ContainerDied","Data":"a1dbf6c292cd096709c9173ab16e98d34d6b4899b2eae82c2ee00bd664f557d8"} Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.562435 4898 generic.go:334] "Generic (PLEG): container finished" podID="c9d50b6b-d1c5-4072-964e-475b4bfb685d" containerID="cefc528e45f4b84ad123b7112e1b8cdbc6fffed72b0905ba66016708c0b55e09" exitCode=0 Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.562482 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9d50b6b-d1c5-4072-964e-475b4bfb685d","Type":"ContainerDied","Data":"cefc528e45f4b84ad123b7112e1b8cdbc6fffed72b0905ba66016708c0b55e09"} Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.562511 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.562532 4898 scope.go:117] "RemoveContainer" containerID="cefc528e45f4b84ad123b7112e1b8cdbc6fffed72b0905ba66016708c0b55e09" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.562520 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9d50b6b-d1c5-4072-964e-475b4bfb685d","Type":"ContainerDied","Data":"4157068ed5e8522edbbd23b7fe110bdef403b8e56bc491ebd1eb422c9a9c7a4a"} Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.581783 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9d50b6b-d1c5-4072-964e-475b4bfb685d-logs\") pod \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.581848 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-combined-ca-bundle\") pod \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.581874 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-nova-metadata-tls-certs\") pod \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.582117 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-config-data\") pod \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 
13:29:31.582158 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzwhc\" (UniqueName: \"kubernetes.io/projected/c9d50b6b-d1c5-4072-964e-475b4bfb685d-kube-api-access-wzwhc\") pod \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\" (UID: \"c9d50b6b-d1c5-4072-964e-475b4bfb685d\") " Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.583710 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9d50b6b-d1c5-4072-964e-475b4bfb685d-logs" (OuterVolumeSpecName: "logs") pod "c9d50b6b-d1c5-4072-964e-475b4bfb685d" (UID: "c9d50b6b-d1c5-4072-964e-475b4bfb685d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.586256 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9d50b6b-d1c5-4072-964e-475b4bfb685d-logs\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.590612 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9d50b6b-d1c5-4072-964e-475b4bfb685d-kube-api-access-wzwhc" (OuterVolumeSpecName: "kube-api-access-wzwhc") pod "c9d50b6b-d1c5-4072-964e-475b4bfb685d" (UID: "c9d50b6b-d1c5-4072-964e-475b4bfb685d"). InnerVolumeSpecName "kube-api-access-wzwhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.598380 4898 scope.go:117] "RemoveContainer" containerID="2f061f8cda1338f451f3df1a1607c6f6db4157dc25c409a20ffdd023497e8b3b" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.618511 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-config-data" (OuterVolumeSpecName: "config-data") pod "c9d50b6b-d1c5-4072-964e-475b4bfb685d" (UID: "c9d50b6b-d1c5-4072-964e-475b4bfb685d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.688732 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzwhc\" (UniqueName: \"kubernetes.io/projected/c9d50b6b-d1c5-4072-964e-475b4bfb685d-kube-api-access-wzwhc\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.689003 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.698392 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c9d50b6b-d1c5-4072-964e-475b4bfb685d" (UID: "c9d50b6b-d1c5-4072-964e-475b4bfb685d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.699792 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9d50b6b-d1c5-4072-964e-475b4bfb685d" (UID: "c9d50b6b-d1c5-4072-964e-475b4bfb685d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.775595 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.781871 4898 scope.go:117] "RemoveContainer" containerID="cefc528e45f4b84ad123b7112e1b8cdbc6fffed72b0905ba66016708c0b55e09" Dec 11 13:29:31 crc kubenswrapper[4898]: E1211 13:29:31.782265 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cefc528e45f4b84ad123b7112e1b8cdbc6fffed72b0905ba66016708c0b55e09\": container with ID starting with cefc528e45f4b84ad123b7112e1b8cdbc6fffed72b0905ba66016708c0b55e09 not found: ID does not exist" containerID="cefc528e45f4b84ad123b7112e1b8cdbc6fffed72b0905ba66016708c0b55e09" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.782389 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cefc528e45f4b84ad123b7112e1b8cdbc6fffed72b0905ba66016708c0b55e09"} err="failed to get container status \"cefc528e45f4b84ad123b7112e1b8cdbc6fffed72b0905ba66016708c0b55e09\": rpc error: code = NotFound desc = could not find container \"cefc528e45f4b84ad123b7112e1b8cdbc6fffed72b0905ba66016708c0b55e09\": container with ID starting with cefc528e45f4b84ad123b7112e1b8cdbc6fffed72b0905ba66016708c0b55e09 not found: ID does not exist" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.782505 4898 scope.go:117] "RemoveContainer" containerID="2f061f8cda1338f451f3df1a1607c6f6db4157dc25c409a20ffdd023497e8b3b" Dec 11 13:29:31 crc kubenswrapper[4898]: E1211 13:29:31.782884 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f061f8cda1338f451f3df1a1607c6f6db4157dc25c409a20ffdd023497e8b3b\": container with ID starting with 2f061f8cda1338f451f3df1a1607c6f6db4157dc25c409a20ffdd023497e8b3b not found: ID does not exist" containerID="2f061f8cda1338f451f3df1a1607c6f6db4157dc25c409a20ffdd023497e8b3b" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 
13:29:31.782924 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f061f8cda1338f451f3df1a1607c6f6db4157dc25c409a20ffdd023497e8b3b"} err="failed to get container status \"2f061f8cda1338f451f3df1a1607c6f6db4157dc25c409a20ffdd023497e8b3b\": rpc error: code = NotFound desc = could not find container \"2f061f8cda1338f451f3df1a1607c6f6db4157dc25c409a20ffdd023497e8b3b\": container with ID starting with 2f061f8cda1338f451f3df1a1607c6f6db4157dc25c409a20ffdd023497e8b3b not found: ID does not exist" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.791835 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.791876 4898 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9d50b6b-d1c5-4072-964e-475b4bfb685d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.893188 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4337e072-ac49-4912-946d-e52de7c33a1b-config-data\") pod \"4337e072-ac49-4912-946d-e52de7c33a1b\" (UID: \"4337e072-ac49-4912-946d-e52de7c33a1b\") " Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.893322 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4337e072-ac49-4912-946d-e52de7c33a1b-combined-ca-bundle\") pod \"4337e072-ac49-4912-946d-e52de7c33a1b\" (UID: \"4337e072-ac49-4912-946d-e52de7c33a1b\") " Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.893379 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7vfl\" (UniqueName: 
\"kubernetes.io/projected/4337e072-ac49-4912-946d-e52de7c33a1b-kube-api-access-k7vfl\") pod \"4337e072-ac49-4912-946d-e52de7c33a1b\" (UID: \"4337e072-ac49-4912-946d-e52de7c33a1b\") " Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.897976 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4337e072-ac49-4912-946d-e52de7c33a1b-kube-api-access-k7vfl" (OuterVolumeSpecName: "kube-api-access-k7vfl") pod "4337e072-ac49-4912-946d-e52de7c33a1b" (UID: "4337e072-ac49-4912-946d-e52de7c33a1b"). InnerVolumeSpecName "kube-api-access-k7vfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.902995 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.937749 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.938232 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4337e072-ac49-4912-946d-e52de7c33a1b-config-data" (OuterVolumeSpecName: "config-data") pod "4337e072-ac49-4912-946d-e52de7c33a1b" (UID: "4337e072-ac49-4912-946d-e52de7c33a1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.954718 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4337e072-ac49-4912-946d-e52de7c33a1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4337e072-ac49-4912-946d-e52de7c33a1b" (UID: "4337e072-ac49-4912-946d-e52de7c33a1b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.960734 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:29:31 crc kubenswrapper[4898]: E1211 13:29:31.961700 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d50b6b-d1c5-4072-964e-475b4bfb685d" containerName="nova-metadata-log" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.961809 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d50b6b-d1c5-4072-964e-475b4bfb685d" containerName="nova-metadata-log" Dec 11 13:29:31 crc kubenswrapper[4898]: E1211 13:29:31.961861 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d50b6b-d1c5-4072-964e-475b4bfb685d" containerName="nova-metadata-metadata" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.961867 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d50b6b-d1c5-4072-964e-475b4bfb685d" containerName="nova-metadata-metadata" Dec 11 13:29:31 crc kubenswrapper[4898]: E1211 13:29:31.961876 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4337e072-ac49-4912-946d-e52de7c33a1b" containerName="nova-scheduler-scheduler" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.961882 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4337e072-ac49-4912-946d-e52de7c33a1b" containerName="nova-scheduler-scheduler" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.962118 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9d50b6b-d1c5-4072-964e-475b4bfb685d" containerName="nova-metadata-log" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.962140 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4337e072-ac49-4912-946d-e52de7c33a1b" containerName="nova-scheduler-scheduler" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.962154 4898 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c9d50b6b-d1c5-4072-964e-475b4bfb685d" containerName="nova-metadata-metadata" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.964692 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.972379 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.972791 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.988991 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.997501 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4337e072-ac49-4912-946d-e52de7c33a1b-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.997536 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4337e072-ac49-4912-946d-e52de7c33a1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:31 crc kubenswrapper[4898]: I1211 13:29:31.997550 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7vfl\" (UniqueName: \"kubernetes.io/projected/4337e072-ac49-4912-946d-e52de7c33a1b-kube-api-access-k7vfl\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.099381 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7102f4f-6a78-4e44-8f68-9efeccd5d632-logs\") pod \"nova-metadata-0\" (UID: \"a7102f4f-6a78-4e44-8f68-9efeccd5d632\") " pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.099712 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7102f4f-6a78-4e44-8f68-9efeccd5d632-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a7102f4f-6a78-4e44-8f68-9efeccd5d632\") " pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.099834 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7102f4f-6a78-4e44-8f68-9efeccd5d632-config-data\") pod \"nova-metadata-0\" (UID: \"a7102f4f-6a78-4e44-8f68-9efeccd5d632\") " pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.100138 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7102f4f-6a78-4e44-8f68-9efeccd5d632-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a7102f4f-6a78-4e44-8f68-9efeccd5d632\") " pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.100227 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hszh\" (UniqueName: \"kubernetes.io/projected/a7102f4f-6a78-4e44-8f68-9efeccd5d632-kube-api-access-9hszh\") pod \"nova-metadata-0\" (UID: \"a7102f4f-6a78-4e44-8f68-9efeccd5d632\") " pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.202664 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7102f4f-6a78-4e44-8f68-9efeccd5d632-config-data\") pod \"nova-metadata-0\" (UID: \"a7102f4f-6a78-4e44-8f68-9efeccd5d632\") " pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.202839 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7102f4f-6a78-4e44-8f68-9efeccd5d632-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a7102f4f-6a78-4e44-8f68-9efeccd5d632\") " pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.202901 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hszh\" (UniqueName: \"kubernetes.io/projected/a7102f4f-6a78-4e44-8f68-9efeccd5d632-kube-api-access-9hszh\") pod \"nova-metadata-0\" (UID: \"a7102f4f-6a78-4e44-8f68-9efeccd5d632\") " pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.203678 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7102f4f-6a78-4e44-8f68-9efeccd5d632-logs\") pod \"nova-metadata-0\" (UID: \"a7102f4f-6a78-4e44-8f68-9efeccd5d632\") " pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.204125 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7102f4f-6a78-4e44-8f68-9efeccd5d632-logs\") pod \"nova-metadata-0\" (UID: \"a7102f4f-6a78-4e44-8f68-9efeccd5d632\") " pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.204298 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7102f4f-6a78-4e44-8f68-9efeccd5d632-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a7102f4f-6a78-4e44-8f68-9efeccd5d632\") " pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.207891 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7102f4f-6a78-4e44-8f68-9efeccd5d632-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a7102f4f-6a78-4e44-8f68-9efeccd5d632\") " 
pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.208297 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7102f4f-6a78-4e44-8f68-9efeccd5d632-config-data\") pod \"nova-metadata-0\" (UID: \"a7102f4f-6a78-4e44-8f68-9efeccd5d632\") " pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.209208 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7102f4f-6a78-4e44-8f68-9efeccd5d632-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a7102f4f-6a78-4e44-8f68-9efeccd5d632\") " pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.221191 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hszh\" (UniqueName: \"kubernetes.io/projected/a7102f4f-6a78-4e44-8f68-9efeccd5d632-kube-api-access-9hszh\") pod \"nova-metadata-0\" (UID: \"a7102f4f-6a78-4e44-8f68-9efeccd5d632\") " pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.303980 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.576946 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.576944 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4337e072-ac49-4912-946d-e52de7c33a1b","Type":"ContainerDied","Data":"7d4092cc18d7a71bfad409a098b09b27912d77aadb068847a880021b0a483d96"} Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.577076 4898 scope.go:117] "RemoveContainer" containerID="a1dbf6c292cd096709c9173ab16e98d34d6b4899b2eae82c2ee00bd664f557d8" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.668016 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.681723 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.694556 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.696349 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.698486 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.712409 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.771773 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 13:29:32 crc kubenswrapper[4898]: W1211 13:29:32.772829 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7102f4f_6a78_4e44_8f68_9efeccd5d632.slice/crio-cf8bba8e671b5c3d3f418d5cef3ef54a2b3b05daa3ef6946e8fa1915f8deb381 WatchSource:0}: Error finding container cf8bba8e671b5c3d3f418d5cef3ef54a2b3b05daa3ef6946e8fa1915f8deb381: Status 404 returned error can't find the container with id cf8bba8e671b5c3d3f418d5cef3ef54a2b3b05daa3ef6946e8fa1915f8deb381 Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.786638 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4337e072-ac49-4912-946d-e52de7c33a1b" path="/var/lib/kubelet/pods/4337e072-ac49-4912-946d-e52de7c33a1b/volumes" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.787218 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9d50b6b-d1c5-4072-964e-475b4bfb685d" path="/var/lib/kubelet/pods/c9d50b6b-d1c5-4072-964e-475b4bfb685d/volumes" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.819769 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/119fd4c2-86a1-48d1-8005-2fc9a3062219-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"119fd4c2-86a1-48d1-8005-2fc9a3062219\") " pod="openstack/nova-scheduler-0" Dec 11 13:29:32 crc 
kubenswrapper[4898]: I1211 13:29:32.819922 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktg9n\" (UniqueName: \"kubernetes.io/projected/119fd4c2-86a1-48d1-8005-2fc9a3062219-kube-api-access-ktg9n\") pod \"nova-scheduler-0\" (UID: \"119fd4c2-86a1-48d1-8005-2fc9a3062219\") " pod="openstack/nova-scheduler-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.820063 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/119fd4c2-86a1-48d1-8005-2fc9a3062219-config-data\") pod \"nova-scheduler-0\" (UID: \"119fd4c2-86a1-48d1-8005-2fc9a3062219\") " pod="openstack/nova-scheduler-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.922047 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktg9n\" (UniqueName: \"kubernetes.io/projected/119fd4c2-86a1-48d1-8005-2fc9a3062219-kube-api-access-ktg9n\") pod \"nova-scheduler-0\" (UID: \"119fd4c2-86a1-48d1-8005-2fc9a3062219\") " pod="openstack/nova-scheduler-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.922147 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/119fd4c2-86a1-48d1-8005-2fc9a3062219-config-data\") pod \"nova-scheduler-0\" (UID: \"119fd4c2-86a1-48d1-8005-2fc9a3062219\") " pod="openstack/nova-scheduler-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.923762 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/119fd4c2-86a1-48d1-8005-2fc9a3062219-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"119fd4c2-86a1-48d1-8005-2fc9a3062219\") " pod="openstack/nova-scheduler-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.927010 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/119fd4c2-86a1-48d1-8005-2fc9a3062219-config-data\") pod \"nova-scheduler-0\" (UID: \"119fd4c2-86a1-48d1-8005-2fc9a3062219\") " pod="openstack/nova-scheduler-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.928003 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/119fd4c2-86a1-48d1-8005-2fc9a3062219-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"119fd4c2-86a1-48d1-8005-2fc9a3062219\") " pod="openstack/nova-scheduler-0" Dec 11 13:29:32 crc kubenswrapper[4898]: I1211 13:29:32.938424 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktg9n\" (UniqueName: \"kubernetes.io/projected/119fd4c2-86a1-48d1-8005-2fc9a3062219-kube-api-access-ktg9n\") pod \"nova-scheduler-0\" (UID: \"119fd4c2-86a1-48d1-8005-2fc9a3062219\") " pod="openstack/nova-scheduler-0" Dec 11 13:29:33 crc kubenswrapper[4898]: I1211 13:29:33.018423 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 13:29:33 crc kubenswrapper[4898]: I1211 13:29:33.516898 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 13:29:33 crc kubenswrapper[4898]: I1211 13:29:33.620959 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7102f4f-6a78-4e44-8f68-9efeccd5d632","Type":"ContainerStarted","Data":"5fa68b9c1e64da097b813a2bed8c39ca79d51af0046dc9c00bd559a2f8bc054c"} Dec 11 13:29:33 crc kubenswrapper[4898]: I1211 13:29:33.621311 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7102f4f-6a78-4e44-8f68-9efeccd5d632","Type":"ContainerStarted","Data":"28609c82872af042a0decdc69b12b83d145c88392574746f5cb8da2da51273f8"} Dec 11 13:29:33 crc kubenswrapper[4898]: I1211 13:29:33.621329 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7102f4f-6a78-4e44-8f68-9efeccd5d632","Type":"ContainerStarted","Data":"cf8bba8e671b5c3d3f418d5cef3ef54a2b3b05daa3ef6946e8fa1915f8deb381"} Dec 11 13:29:33 crc kubenswrapper[4898]: I1211 13:29:33.643227 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"119fd4c2-86a1-48d1-8005-2fc9a3062219","Type":"ContainerStarted","Data":"3c5d5eb03e6ea3eec48391c025ef66f7fa83585b3e2d443677b0ab7c5d175c37"} Dec 11 13:29:33 crc kubenswrapper[4898]: I1211 13:29:33.677317 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.6772967789999997 podStartE2EDuration="2.677296779s" podCreationTimestamp="2025-12-11 13:29:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:29:33.665410147 +0000 UTC m=+1531.237736594" watchObservedRunningTime="2025-12-11 13:29:33.677296779 +0000 UTC m=+1531.249623216" Dec 11 13:29:34 
crc kubenswrapper[4898]: I1211 13:29:34.658280 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"119fd4c2-86a1-48d1-8005-2fc9a3062219","Type":"ContainerStarted","Data":"a33592a023877e565bb289a2eebce23036ca581fe7f8864582f31c5255c8422a"} Dec 11 13:29:34 crc kubenswrapper[4898]: I1211 13:29:34.688192 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.688168786 podStartE2EDuration="2.688168786s" podCreationTimestamp="2025-12-11 13:29:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:29:34.679032839 +0000 UTC m=+1532.251359336" watchObservedRunningTime="2025-12-11 13:29:34.688168786 +0000 UTC m=+1532.260495223" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.547075 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.597764 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbn6c\" (UniqueName: \"kubernetes.io/projected/b73e56da-3159-42d6-a1f8-d72c25c82451-kube-api-access-wbn6c\") pod \"b73e56da-3159-42d6-a1f8-d72c25c82451\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.597826 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-combined-ca-bundle\") pod \"b73e56da-3159-42d6-a1f8-d72c25c82451\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.597868 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-config-data\") pod 
\"b73e56da-3159-42d6-a1f8-d72c25c82451\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.598001 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-scripts\") pod \"b73e56da-3159-42d6-a1f8-d72c25c82451\" (UID: \"b73e56da-3159-42d6-a1f8-d72c25c82451\") " Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.604682 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-scripts" (OuterVolumeSpecName: "scripts") pod "b73e56da-3159-42d6-a1f8-d72c25c82451" (UID: "b73e56da-3159-42d6-a1f8-d72c25c82451"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.621744 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b73e56da-3159-42d6-a1f8-d72c25c82451-kube-api-access-wbn6c" (OuterVolumeSpecName: "kube-api-access-wbn6c") pod "b73e56da-3159-42d6-a1f8-d72c25c82451" (UID: "b73e56da-3159-42d6-a1f8-d72c25c82451"). InnerVolumeSpecName "kube-api-access-wbn6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.703507 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.703568 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbn6c\" (UniqueName: \"kubernetes.io/projected/b73e56da-3159-42d6-a1f8-d72c25c82451-kube-api-access-wbn6c\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.716729 4898 generic.go:334] "Generic (PLEG): container finished" podID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerID="03b964980a19479b34062f3c357c1cab94e007b1fa03cc7f46ee1ba0608dab69" exitCode=137 Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.718595 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.718734 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b73e56da-3159-42d6-a1f8-d72c25c82451","Type":"ContainerDied","Data":"03b964980a19479b34062f3c357c1cab94e007b1fa03cc7f46ee1ba0608dab69"} Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.718833 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b73e56da-3159-42d6-a1f8-d72c25c82451","Type":"ContainerDied","Data":"163e91db32d2702038e09839f575e2ac673c39e9c44579879d49c23e0f2fa361"} Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.718875 4898 scope.go:117] "RemoveContainer" containerID="03b964980a19479b34062f3c357c1cab94e007b1fa03cc7f46ee1ba0608dab69" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.768048 4898 scope.go:117] "RemoveContainer" containerID="c20379cd692de345371ccacef2282a32d999fa39e12d626e1bb6a5519452e5e4" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 
13:29:35.773596 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b73e56da-3159-42d6-a1f8-d72c25c82451" (UID: "b73e56da-3159-42d6-a1f8-d72c25c82451"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.781747 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-config-data" (OuterVolumeSpecName: "config-data") pod "b73e56da-3159-42d6-a1f8-d72c25c82451" (UID: "b73e56da-3159-42d6-a1f8-d72c25c82451"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.794290 4898 scope.go:117] "RemoveContainer" containerID="f5c9cad012b20f086cf7603bb966371d258a905d3caa52f32907ce842f5015b5" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.807754 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.807927 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b73e56da-3159-42d6-a1f8-d72c25c82451-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.816115 4898 scope.go:117] "RemoveContainer" containerID="ebde97722847b184057ddff9e66777896d279cb30e997da67b1e0fc4bc72b07c" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.837749 4898 scope.go:117] "RemoveContainer" containerID="03b964980a19479b34062f3c357c1cab94e007b1fa03cc7f46ee1ba0608dab69" Dec 11 13:29:35 crc kubenswrapper[4898]: E1211 13:29:35.838810 4898 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03b964980a19479b34062f3c357c1cab94e007b1fa03cc7f46ee1ba0608dab69\": container with ID starting with 03b964980a19479b34062f3c357c1cab94e007b1fa03cc7f46ee1ba0608dab69 not found: ID does not exist" containerID="03b964980a19479b34062f3c357c1cab94e007b1fa03cc7f46ee1ba0608dab69" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.838923 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03b964980a19479b34062f3c357c1cab94e007b1fa03cc7f46ee1ba0608dab69"} err="failed to get container status \"03b964980a19479b34062f3c357c1cab94e007b1fa03cc7f46ee1ba0608dab69\": rpc error: code = NotFound desc = could not find container \"03b964980a19479b34062f3c357c1cab94e007b1fa03cc7f46ee1ba0608dab69\": container with ID starting with 03b964980a19479b34062f3c357c1cab94e007b1fa03cc7f46ee1ba0608dab69 not found: ID does not exist" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.839005 4898 scope.go:117] "RemoveContainer" containerID="c20379cd692de345371ccacef2282a32d999fa39e12d626e1bb6a5519452e5e4" Dec 11 13:29:35 crc kubenswrapper[4898]: E1211 13:29:35.839387 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c20379cd692de345371ccacef2282a32d999fa39e12d626e1bb6a5519452e5e4\": container with ID starting with c20379cd692de345371ccacef2282a32d999fa39e12d626e1bb6a5519452e5e4 not found: ID does not exist" containerID="c20379cd692de345371ccacef2282a32d999fa39e12d626e1bb6a5519452e5e4" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.839419 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c20379cd692de345371ccacef2282a32d999fa39e12d626e1bb6a5519452e5e4"} err="failed to get container status \"c20379cd692de345371ccacef2282a32d999fa39e12d626e1bb6a5519452e5e4\": rpc error: code = NotFound desc = could not find container 
\"c20379cd692de345371ccacef2282a32d999fa39e12d626e1bb6a5519452e5e4\": container with ID starting with c20379cd692de345371ccacef2282a32d999fa39e12d626e1bb6a5519452e5e4 not found: ID does not exist" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.839475 4898 scope.go:117] "RemoveContainer" containerID="f5c9cad012b20f086cf7603bb966371d258a905d3caa52f32907ce842f5015b5" Dec 11 13:29:35 crc kubenswrapper[4898]: E1211 13:29:35.839833 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5c9cad012b20f086cf7603bb966371d258a905d3caa52f32907ce842f5015b5\": container with ID starting with f5c9cad012b20f086cf7603bb966371d258a905d3caa52f32907ce842f5015b5 not found: ID does not exist" containerID="f5c9cad012b20f086cf7603bb966371d258a905d3caa52f32907ce842f5015b5" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.839912 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5c9cad012b20f086cf7603bb966371d258a905d3caa52f32907ce842f5015b5"} err="failed to get container status \"f5c9cad012b20f086cf7603bb966371d258a905d3caa52f32907ce842f5015b5\": rpc error: code = NotFound desc = could not find container \"f5c9cad012b20f086cf7603bb966371d258a905d3caa52f32907ce842f5015b5\": container with ID starting with f5c9cad012b20f086cf7603bb966371d258a905d3caa52f32907ce842f5015b5 not found: ID does not exist" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.839997 4898 scope.go:117] "RemoveContainer" containerID="ebde97722847b184057ddff9e66777896d279cb30e997da67b1e0fc4bc72b07c" Dec 11 13:29:35 crc kubenswrapper[4898]: E1211 13:29:35.840412 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebde97722847b184057ddff9e66777896d279cb30e997da67b1e0fc4bc72b07c\": container with ID starting with ebde97722847b184057ddff9e66777896d279cb30e997da67b1e0fc4bc72b07c not found: ID does not exist" 
containerID="ebde97722847b184057ddff9e66777896d279cb30e997da67b1e0fc4bc72b07c" Dec 11 13:29:35 crc kubenswrapper[4898]: I1211 13:29:35.840503 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebde97722847b184057ddff9e66777896d279cb30e997da67b1e0fc4bc72b07c"} err="failed to get container status \"ebde97722847b184057ddff9e66777896d279cb30e997da67b1e0fc4bc72b07c\": rpc error: code = NotFound desc = could not find container \"ebde97722847b184057ddff9e66777896d279cb30e997da67b1e0fc4bc72b07c\": container with ID starting with ebde97722847b184057ddff9e66777896d279cb30e997da67b1e0fc4bc72b07c not found: ID does not exist" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.099378 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.122401 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.147312 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 11 13:29:36 crc kubenswrapper[4898]: E1211 13:29:36.148439 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-listener" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.148563 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-listener" Dec 11 13:29:36 crc kubenswrapper[4898]: E1211 13:29:36.148653 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-notifier" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.148708 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-notifier" Dec 11 13:29:36 crc kubenswrapper[4898]: E1211 13:29:36.148768 4898 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-evaluator" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.148828 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-evaluator" Dec 11 13:29:36 crc kubenswrapper[4898]: E1211 13:29:36.148896 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-api" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.148949 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-api" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.149240 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-api" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.149300 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-evaluator" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.149359 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-notifier" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.149429 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" containerName="aodh-listener" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.151503 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.153870 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.154041 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-h8xn6" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.154299 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.154351 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.155328 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.174248 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.215625 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-scripts\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.215967 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-combined-ca-bundle\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.216323 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6gf7\" (UniqueName: 
\"kubernetes.io/projected/987dc857-e48a-418b-b021-4c6048a9c47d-kube-api-access-c6gf7\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.216539 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-public-tls-certs\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.216941 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-config-data\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.216966 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-internal-tls-certs\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.319273 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6gf7\" (UniqueName: \"kubernetes.io/projected/987dc857-e48a-418b-b021-4c6048a9c47d-kube-api-access-c6gf7\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.319419 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-public-tls-certs\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc 
kubenswrapper[4898]: I1211 13:29:36.319582 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-config-data\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.319617 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-internal-tls-certs\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.319674 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-scripts\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.319741 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-combined-ca-bundle\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.325374 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-combined-ca-bundle\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.326990 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-scripts\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " 
pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.327134 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-config-data\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.329939 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-public-tls-certs\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.336604 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-internal-tls-certs\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.338749 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gf7\" (UniqueName: \"kubernetes.io/projected/987dc857-e48a-418b-b021-4c6048a9c47d-kube-api-access-c6gf7\") pod \"aodh-0\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.471646 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 11 13:29:36 crc kubenswrapper[4898]: I1211 13:29:36.791032 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b73e56da-3159-42d6-a1f8-d72c25c82451" path="/var/lib/kubelet/pods/b73e56da-3159-42d6-a1f8-d72c25c82451/volumes" Dec 11 13:29:37 crc kubenswrapper[4898]: I1211 13:29:37.151268 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 11 13:29:37 crc kubenswrapper[4898]: W1211 13:29:37.153958 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod987dc857_e48a_418b_b021_4c6048a9c47d.slice/crio-3d95242720a9b4647f124d419dcd5baf02ce3ed1aed5bf25907b93d3e32a2a8c WatchSource:0}: Error finding container 3d95242720a9b4647f124d419dcd5baf02ce3ed1aed5bf25907b93d3e32a2a8c: Status 404 returned error can't find the container with id 3d95242720a9b4647f124d419dcd5baf02ce3ed1aed5bf25907b93d3e32a2a8c Dec 11 13:29:37 crc kubenswrapper[4898]: I1211 13:29:37.304384 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 11 13:29:37 crc kubenswrapper[4898]: I1211 13:29:37.304532 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 11 13:29:37 crc kubenswrapper[4898]: I1211 13:29:37.767914 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"987dc857-e48a-418b-b021-4c6048a9c47d","Type":"ContainerStarted","Data":"3d95242720a9b4647f124d419dcd5baf02ce3ed1aed5bf25907b93d3e32a2a8c"} Dec 11 13:29:38 crc kubenswrapper[4898]: I1211 13:29:38.019333 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 11 13:29:38 crc kubenswrapper[4898]: I1211 13:29:38.794589 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"987dc857-e48a-418b-b021-4c6048a9c47d","Type":"ContainerStarted","Data":"de4b16c5afcb5773b006b4743cd600fd4507bc0a5cf283b2ebd4df8bb6cfe58e"} Dec 11 13:29:38 crc kubenswrapper[4898]: I1211 13:29:38.795180 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"987dc857-e48a-418b-b021-4c6048a9c47d","Type":"ContainerStarted","Data":"5827c5a3c0bb30c0cc495c71bdc472acab633f448964aa6a6369156d76ec74ee"} Dec 11 13:29:39 crc kubenswrapper[4898]: I1211 13:29:39.226866 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 11 13:29:39 crc kubenswrapper[4898]: I1211 13:29:39.226920 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 11 13:29:39 crc kubenswrapper[4898]: I1211 13:29:39.800272 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"987dc857-e48a-418b-b021-4c6048a9c47d","Type":"ContainerStarted","Data":"6b5434e0063a62a0d125c37aa7e9aba4fae1680a17072d74cbf1cca9505d5d72"} Dec 11 13:29:40 crc kubenswrapper[4898]: I1211 13:29:40.240834 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="12f955af-c5c3-47de-b1ef-f23b46d06a62" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.255:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 13:29:40 crc kubenswrapper[4898]: I1211 13:29:40.241073 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="12f955af-c5c3-47de-b1ef-f23b46d06a62" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.255:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 13:29:40 crc kubenswrapper[4898]: I1211 13:29:40.813185 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"987dc857-e48a-418b-b021-4c6048a9c47d","Type":"ContainerStarted","Data":"d43c6a7ba4d05c274438d925d713c555505735c68c1af45658a1cbb485f781cd"} Dec 11 13:29:40 crc kubenswrapper[4898]: I1211 13:29:40.835345 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.955344824 podStartE2EDuration="4.835320205s" podCreationTimestamp="2025-12-11 13:29:36 +0000 UTC" firstStartedPulling="2025-12-11 13:29:37.156861825 +0000 UTC m=+1534.729188262" lastFinishedPulling="2025-12-11 13:29:40.036837206 +0000 UTC m=+1537.609163643" observedRunningTime="2025-12-11 13:29:40.831741428 +0000 UTC m=+1538.404067885" watchObservedRunningTime="2025-12-11 13:29:40.835320205 +0000 UTC m=+1538.407646642" Dec 11 13:29:42 crc kubenswrapper[4898]: I1211 13:29:42.304820 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 11 13:29:42 crc kubenswrapper[4898]: I1211 13:29:42.306805 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 11 13:29:43 crc kubenswrapper[4898]: I1211 13:29:43.019398 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 11 13:29:43 crc kubenswrapper[4898]: I1211 13:29:43.050859 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 11 13:29:43 crc kubenswrapper[4898]: I1211 13:29:43.314623 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a7102f4f-6a78-4e44-8f68-9efeccd5d632" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.0:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 13:29:43 crc kubenswrapper[4898]: I1211 13:29:43.314664 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="a7102f4f-6a78-4e44-8f68-9efeccd5d632" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.0:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 13:29:43 crc kubenswrapper[4898]: I1211 13:29:43.898386 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 11 13:29:45 crc kubenswrapper[4898]: I1211 13:29:45.720085 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 11 13:29:49 crc kubenswrapper[4898]: I1211 13:29:49.210590 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 11 13:29:49 crc kubenswrapper[4898]: I1211 13:29:49.211423 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 11 13:29:49 crc kubenswrapper[4898]: I1211 13:29:49.218748 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 11 13:29:49 crc kubenswrapper[4898]: I1211 13:29:49.225858 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 11 13:29:49 crc kubenswrapper[4898]: I1211 13:29:49.711760 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 13:29:49 crc kubenswrapper[4898]: I1211 13:29:49.712282 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="354dfb1c-30d5-4253-beeb-3e57ca531689" containerName="kube-state-metrics" containerID="cri-o://473fa7fe6a537a1b8710634ab2622e077258c24c2ed80cd70ef967c0615240d6" gracePeriod=30 Dec 11 13:29:49 crc kubenswrapper[4898]: I1211 13:29:49.777105 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 11 13:29:49 crc kubenswrapper[4898]: I1211 13:29:49.777334 4898 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/mysqld-exporter-0" podUID="3a77ab96-9580-4509-ba2c-963e51ed44a5" containerName="mysqld-exporter" containerID="cri-o://3947f61e9b2304dc24476513cec9b1930a2cdd56cace91f82e0ff5dfa848e8e0" gracePeriod=30 Dec 11 13:29:49 crc kubenswrapper[4898]: I1211 13:29:49.931307 4898 generic.go:334] "Generic (PLEG): container finished" podID="354dfb1c-30d5-4253-beeb-3e57ca531689" containerID="473fa7fe6a537a1b8710634ab2622e077258c24c2ed80cd70ef967c0615240d6" exitCode=2 Dec 11 13:29:49 crc kubenswrapper[4898]: I1211 13:29:49.931414 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"354dfb1c-30d5-4253-beeb-3e57ca531689","Type":"ContainerDied","Data":"473fa7fe6a537a1b8710634ab2622e077258c24c2ed80cd70ef967c0615240d6"} Dec 11 13:29:49 crc kubenswrapper[4898]: I1211 13:29:49.933510 4898 generic.go:334] "Generic (PLEG): container finished" podID="3a77ab96-9580-4509-ba2c-963e51ed44a5" containerID="3947f61e9b2304dc24476513cec9b1930a2cdd56cace91f82e0ff5dfa848e8e0" exitCode=2 Dec 11 13:29:49 crc kubenswrapper[4898]: I1211 13:29:49.935511 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"3a77ab96-9580-4509-ba2c-963e51ed44a5","Type":"ContainerDied","Data":"3947f61e9b2304dc24476513cec9b1930a2cdd56cace91f82e0ff5dfa848e8e0"} Dec 11 13:29:49 crc kubenswrapper[4898]: I1211 13:29:49.935581 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 11 13:29:49 crc kubenswrapper[4898]: I1211 13:29:49.950634 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.241386 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.367100 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.393823 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shs6l\" (UniqueName: \"kubernetes.io/projected/354dfb1c-30d5-4253-beeb-3e57ca531689-kube-api-access-shs6l\") pod \"354dfb1c-30d5-4253-beeb-3e57ca531689\" (UID: \"354dfb1c-30d5-4253-beeb-3e57ca531689\") " Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.399530 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/354dfb1c-30d5-4253-beeb-3e57ca531689-kube-api-access-shs6l" (OuterVolumeSpecName: "kube-api-access-shs6l") pod "354dfb1c-30d5-4253-beeb-3e57ca531689" (UID: "354dfb1c-30d5-4253-beeb-3e57ca531689"). InnerVolumeSpecName "kube-api-access-shs6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.495814 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdqz2\" (UniqueName: \"kubernetes.io/projected/3a77ab96-9580-4509-ba2c-963e51ed44a5-kube-api-access-fdqz2\") pod \"3a77ab96-9580-4509-ba2c-963e51ed44a5\" (UID: \"3a77ab96-9580-4509-ba2c-963e51ed44a5\") " Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.495860 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a77ab96-9580-4509-ba2c-963e51ed44a5-config-data\") pod \"3a77ab96-9580-4509-ba2c-963e51ed44a5\" (UID: \"3a77ab96-9580-4509-ba2c-963e51ed44a5\") " Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.496152 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a77ab96-9580-4509-ba2c-963e51ed44a5-combined-ca-bundle\") pod \"3a77ab96-9580-4509-ba2c-963e51ed44a5\" (UID: \"3a77ab96-9580-4509-ba2c-963e51ed44a5\") " Dec 11 13:29:50 crc 
kubenswrapper[4898]: I1211 13:29:50.496673 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shs6l\" (UniqueName: \"kubernetes.io/projected/354dfb1c-30d5-4253-beeb-3e57ca531689-kube-api-access-shs6l\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.509750 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a77ab96-9580-4509-ba2c-963e51ed44a5-kube-api-access-fdqz2" (OuterVolumeSpecName: "kube-api-access-fdqz2") pod "3a77ab96-9580-4509-ba2c-963e51ed44a5" (UID: "3a77ab96-9580-4509-ba2c-963e51ed44a5"). InnerVolumeSpecName "kube-api-access-fdqz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.530732 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a77ab96-9580-4509-ba2c-963e51ed44a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a77ab96-9580-4509-ba2c-963e51ed44a5" (UID: "3a77ab96-9580-4509-ba2c-963e51ed44a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.559373 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a77ab96-9580-4509-ba2c-963e51ed44a5-config-data" (OuterVolumeSpecName: "config-data") pod "3a77ab96-9580-4509-ba2c-963e51ed44a5" (UID: "3a77ab96-9580-4509-ba2c-963e51ed44a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.598591 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a77ab96-9580-4509-ba2c-963e51ed44a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.598634 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdqz2\" (UniqueName: \"kubernetes.io/projected/3a77ab96-9580-4509-ba2c-963e51ed44a5-kube-api-access-fdqz2\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.598648 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a77ab96-9580-4509-ba2c-963e51ed44a5-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.954579 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"3a77ab96-9580-4509-ba2c-963e51ed44a5","Type":"ContainerDied","Data":"a35f41f8c5d53cb13d130f09965357e4a0bf44514942145950cb29c970d489cc"} Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.954638 4898 scope.go:117] "RemoveContainer" containerID="3947f61e9b2304dc24476513cec9b1930a2cdd56cace91f82e0ff5dfa848e8e0" Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.954808 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.961227 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"354dfb1c-30d5-4253-beeb-3e57ca531689","Type":"ContainerDied","Data":"022fc477c0f518ee661899d91f54c29bc1ee8e61021d6712aa7ecd99179376ab"} Dec 11 13:29:50 crc kubenswrapper[4898]: I1211 13:29:50.961387 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.021689 4898 scope.go:117] "RemoveContainer" containerID="473fa7fe6a537a1b8710634ab2622e077258c24c2ed80cd70ef967c0615240d6" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.074551 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.090036 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.105376 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 13:29:51 crc kubenswrapper[4898]: E1211 13:29:51.106036 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a77ab96-9580-4509-ba2c-963e51ed44a5" containerName="mysqld-exporter" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.106103 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a77ab96-9580-4509-ba2c-963e51ed44a5" containerName="mysqld-exporter" Dec 11 13:29:51 crc kubenswrapper[4898]: E1211 13:29:51.106184 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354dfb1c-30d5-4253-beeb-3e57ca531689" containerName="kube-state-metrics" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.106234 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="354dfb1c-30d5-4253-beeb-3e57ca531689" containerName="kube-state-metrics" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.106575 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="354dfb1c-30d5-4253-beeb-3e57ca531689" containerName="kube-state-metrics" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.106668 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a77ab96-9580-4509-ba2c-963e51ed44a5" containerName="mysqld-exporter" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.107622 4898 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.110072 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.110736 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.126655 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.137998 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.155697 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.170025 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.171999 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.173647 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.174296 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.181416 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.220322 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/882df8e7-3b5e-4dd9-af28-fa75e752cade-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"882df8e7-3b5e-4dd9-af28-fa75e752cade\") " pod="openstack/kube-state-metrics-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.220623 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8whc\" (UniqueName: \"kubernetes.io/projected/882df8e7-3b5e-4dd9-af28-fa75e752cade-kube-api-access-l8whc\") pod \"kube-state-metrics-0\" (UID: \"882df8e7-3b5e-4dd9-af28-fa75e752cade\") " pod="openstack/kube-state-metrics-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.220802 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/882df8e7-3b5e-4dd9-af28-fa75e752cade-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"882df8e7-3b5e-4dd9-af28-fa75e752cade\") " pod="openstack/kube-state-metrics-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.221042 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/882df8e7-3b5e-4dd9-af28-fa75e752cade-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"882df8e7-3b5e-4dd9-af28-fa75e752cade\") " pod="openstack/kube-state-metrics-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.323098 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882df8e7-3b5e-4dd9-af28-fa75e752cade-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"882df8e7-3b5e-4dd9-af28-fa75e752cade\") " pod="openstack/kube-state-metrics-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.323164 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0fdc4f-1bd6-4444-80e4-6ce57885c417-config-data\") pod \"mysqld-exporter-0\" (UID: \"8a0fdc4f-1bd6-4444-80e4-6ce57885c417\") " pod="openstack/mysqld-exporter-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.323211 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0fdc4f-1bd6-4444-80e4-6ce57885c417-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"8a0fdc4f-1bd6-4444-80e4-6ce57885c417\") " pod="openstack/mysqld-exporter-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.323267 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/882df8e7-3b5e-4dd9-af28-fa75e752cade-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"882df8e7-3b5e-4dd9-af28-fa75e752cade\") " pod="openstack/kube-state-metrics-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.323303 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prgrs\" (UniqueName: 
\"kubernetes.io/projected/8a0fdc4f-1bd6-4444-80e4-6ce57885c417-kube-api-access-prgrs\") pod \"mysqld-exporter-0\" (UID: \"8a0fdc4f-1bd6-4444-80e4-6ce57885c417\") " pod="openstack/mysqld-exporter-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.323361 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8whc\" (UniqueName: \"kubernetes.io/projected/882df8e7-3b5e-4dd9-af28-fa75e752cade-kube-api-access-l8whc\") pod \"kube-state-metrics-0\" (UID: \"882df8e7-3b5e-4dd9-af28-fa75e752cade\") " pod="openstack/kube-state-metrics-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.323407 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/882df8e7-3b5e-4dd9-af28-fa75e752cade-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"882df8e7-3b5e-4dd9-af28-fa75e752cade\") " pod="openstack/kube-state-metrics-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.323554 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0fdc4f-1bd6-4444-80e4-6ce57885c417-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8a0fdc4f-1bd6-4444-80e4-6ce57885c417\") " pod="openstack/mysqld-exporter-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.328799 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/882df8e7-3b5e-4dd9-af28-fa75e752cade-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"882df8e7-3b5e-4dd9-af28-fa75e752cade\") " pod="openstack/kube-state-metrics-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.333794 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/882df8e7-3b5e-4dd9-af28-fa75e752cade-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"882df8e7-3b5e-4dd9-af28-fa75e752cade\") " pod="openstack/kube-state-metrics-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.342451 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8whc\" (UniqueName: \"kubernetes.io/projected/882df8e7-3b5e-4dd9-af28-fa75e752cade-kube-api-access-l8whc\") pod \"kube-state-metrics-0\" (UID: \"882df8e7-3b5e-4dd9-af28-fa75e752cade\") " pod="openstack/kube-state-metrics-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.369307 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/882df8e7-3b5e-4dd9-af28-fa75e752cade-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"882df8e7-3b5e-4dd9-af28-fa75e752cade\") " pod="openstack/kube-state-metrics-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.425900 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0fdc4f-1bd6-4444-80e4-6ce57885c417-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"8a0fdc4f-1bd6-4444-80e4-6ce57885c417\") " pod="openstack/mysqld-exporter-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.426015 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prgrs\" (UniqueName: \"kubernetes.io/projected/8a0fdc4f-1bd6-4444-80e4-6ce57885c417-kube-api-access-prgrs\") pod \"mysqld-exporter-0\" (UID: \"8a0fdc4f-1bd6-4444-80e4-6ce57885c417\") " pod="openstack/mysqld-exporter-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.426148 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0fdc4f-1bd6-4444-80e4-6ce57885c417-combined-ca-bundle\") pod 
\"mysqld-exporter-0\" (UID: \"8a0fdc4f-1bd6-4444-80e4-6ce57885c417\") " pod="openstack/mysqld-exporter-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.426200 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0fdc4f-1bd6-4444-80e4-6ce57885c417-config-data\") pod \"mysqld-exporter-0\" (UID: \"8a0fdc4f-1bd6-4444-80e4-6ce57885c417\") " pod="openstack/mysqld-exporter-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.431329 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0fdc4f-1bd6-4444-80e4-6ce57885c417-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"8a0fdc4f-1bd6-4444-80e4-6ce57885c417\") " pod="openstack/mysqld-exporter-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.431406 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0fdc4f-1bd6-4444-80e4-6ce57885c417-config-data\") pod \"mysqld-exporter-0\" (UID: \"8a0fdc4f-1bd6-4444-80e4-6ce57885c417\") " pod="openstack/mysqld-exporter-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.431859 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0fdc4f-1bd6-4444-80e4-6ce57885c417-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8a0fdc4f-1bd6-4444-80e4-6ce57885c417\") " pod="openstack/mysqld-exporter-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.431936 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.449818 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prgrs\" (UniqueName: \"kubernetes.io/projected/8a0fdc4f-1bd6-4444-80e4-6ce57885c417-kube-api-access-prgrs\") pod \"mysqld-exporter-0\" (UID: \"8a0fdc4f-1bd6-4444-80e4-6ce57885c417\") " pod="openstack/mysqld-exporter-0" Dec 11 13:29:51 crc kubenswrapper[4898]: I1211 13:29:51.492660 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 11 13:29:52 crc kubenswrapper[4898]: I1211 13:29:52.007484 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 13:29:52 crc kubenswrapper[4898]: I1211 13:29:52.126791 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 11 13:29:52 crc kubenswrapper[4898]: W1211 13:29:52.138418 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a0fdc4f_1bd6_4444_80e4_6ce57885c417.slice/crio-691785d35afa10c59405944d7307f750cf3460480ae2b1aac7dd2b4cc253d754 WatchSource:0}: Error finding container 691785d35afa10c59405944d7307f750cf3460480ae2b1aac7dd2b4cc253d754: Status 404 returned error can't find the container with id 691785d35afa10c59405944d7307f750cf3460480ae2b1aac7dd2b4cc253d754 Dec 11 13:29:52 crc kubenswrapper[4898]: I1211 13:29:52.310391 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:52 crc kubenswrapper[4898]: I1211 13:29:52.311083 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerName="ceilometer-central-agent" containerID="cri-o://0ff71a84e971432d65e47f89bc14dab420c94ec9d0695ce6e52bb57f24dd1027" gracePeriod=30 Dec 11 13:29:52 crc 
kubenswrapper[4898]: I1211 13:29:52.311224 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerName="proxy-httpd" containerID="cri-o://ae5f57e85ad7814ab6cd8ae2cf85c905997f161db6a3cf68751505a0d0f0865c" gracePeriod=30 Dec 11 13:29:52 crc kubenswrapper[4898]: I1211 13:29:52.311278 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerName="sg-core" containerID="cri-o://12351c1f1f32a9581dfa326b208d96bd48915b87a3c8fefd0ebc8992142c55c5" gracePeriod=30 Dec 11 13:29:52 crc kubenswrapper[4898]: I1211 13:29:52.311319 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerName="ceilometer-notification-agent" containerID="cri-o://e5b8edb8ea2ed08143ee09684135eca75584ff72dc887d2c5fb4a80e271996bd" gracePeriod=30 Dec 11 13:29:52 crc kubenswrapper[4898]: I1211 13:29:52.320839 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 11 13:29:52 crc kubenswrapper[4898]: I1211 13:29:52.324959 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 11 13:29:52 crc kubenswrapper[4898]: I1211 13:29:52.335967 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 11 13:29:52 crc kubenswrapper[4898]: I1211 13:29:52.798374 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354dfb1c-30d5-4253-beeb-3e57ca531689" path="/var/lib/kubelet/pods/354dfb1c-30d5-4253-beeb-3e57ca531689/volumes" Dec 11 13:29:52 crc kubenswrapper[4898]: I1211 13:29:52.799512 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a77ab96-9580-4509-ba2c-963e51ed44a5" 
path="/var/lib/kubelet/pods/3a77ab96-9580-4509-ba2c-963e51ed44a5/volumes" Dec 11 13:29:53 crc kubenswrapper[4898]: I1211 13:29:53.060957 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"8a0fdc4f-1bd6-4444-80e4-6ce57885c417","Type":"ContainerStarted","Data":"691785d35afa10c59405944d7307f750cf3460480ae2b1aac7dd2b4cc253d754"} Dec 11 13:29:53 crc kubenswrapper[4898]: I1211 13:29:53.063758 4898 generic.go:334] "Generic (PLEG): container finished" podID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerID="ae5f57e85ad7814ab6cd8ae2cf85c905997f161db6a3cf68751505a0d0f0865c" exitCode=0 Dec 11 13:29:53 crc kubenswrapper[4898]: I1211 13:29:53.063784 4898 generic.go:334] "Generic (PLEG): container finished" podID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerID="12351c1f1f32a9581dfa326b208d96bd48915b87a3c8fefd0ebc8992142c55c5" exitCode=2 Dec 11 13:29:53 crc kubenswrapper[4898]: I1211 13:29:53.063793 4898 generic.go:334] "Generic (PLEG): container finished" podID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerID="0ff71a84e971432d65e47f89bc14dab420c94ec9d0695ce6e52bb57f24dd1027" exitCode=0 Dec 11 13:29:53 crc kubenswrapper[4898]: I1211 13:29:53.063839 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b5bec9e-6060-43e8-954c-ee6153e9a841","Type":"ContainerDied","Data":"ae5f57e85ad7814ab6cd8ae2cf85c905997f161db6a3cf68751505a0d0f0865c"} Dec 11 13:29:53 crc kubenswrapper[4898]: I1211 13:29:53.063891 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b5bec9e-6060-43e8-954c-ee6153e9a841","Type":"ContainerDied","Data":"12351c1f1f32a9581dfa326b208d96bd48915b87a3c8fefd0ebc8992142c55c5"} Dec 11 13:29:53 crc kubenswrapper[4898]: I1211 13:29:53.063906 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8b5bec9e-6060-43e8-954c-ee6153e9a841","Type":"ContainerDied","Data":"0ff71a84e971432d65e47f89bc14dab420c94ec9d0695ce6e52bb57f24dd1027"} Dec 11 13:29:53 crc kubenswrapper[4898]: I1211 13:29:53.066179 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"882df8e7-3b5e-4dd9-af28-fa75e752cade","Type":"ContainerStarted","Data":"afd6dd291b9fbbf5aafb645e9bba32a7a5e8ca80704ca88f961ab4377cead80b"} Dec 11 13:29:53 crc kubenswrapper[4898]: I1211 13:29:53.066233 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"882df8e7-3b5e-4dd9-af28-fa75e752cade","Type":"ContainerStarted","Data":"9317889f8d1eab16624ca39bc52262e78db849607b5264d528e64850b494fd8e"} Dec 11 13:29:53 crc kubenswrapper[4898]: I1211 13:29:53.066937 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 11 13:29:53 crc kubenswrapper[4898]: I1211 13:29:53.072689 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 11 13:29:53 crc kubenswrapper[4898]: I1211 13:29:53.095886 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.732799144 podStartE2EDuration="3.095858229s" podCreationTimestamp="2025-12-11 13:29:50 +0000 UTC" firstStartedPulling="2025-12-11 13:29:52.056644685 +0000 UTC m=+1549.628971122" lastFinishedPulling="2025-12-11 13:29:52.41970378 +0000 UTC m=+1549.992030207" observedRunningTime="2025-12-11 13:29:53.081090649 +0000 UTC m=+1550.653417086" watchObservedRunningTime="2025-12-11 13:29:53.095858229 +0000 UTC m=+1550.668184676" Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.077091 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" 
event={"ID":"8a0fdc4f-1bd6-4444-80e4-6ce57885c417","Type":"ContainerStarted","Data":"3d21c1686154c34f53d1396e1ca8ccb8bae454be7309d9f7842de3bb4422be97"} Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.092595 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.344746574 podStartE2EDuration="3.092577003s" podCreationTimestamp="2025-12-11 13:29:51 +0000 UTC" firstStartedPulling="2025-12-11 13:29:52.143433553 +0000 UTC m=+1549.715759990" lastFinishedPulling="2025-12-11 13:29:52.891263992 +0000 UTC m=+1550.463590419" observedRunningTime="2025-12-11 13:29:54.091124194 +0000 UTC m=+1551.663450621" watchObservedRunningTime="2025-12-11 13:29:54.092577003 +0000 UTC m=+1551.664903440" Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.820549 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.953051 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfdhb\" (UniqueName: \"kubernetes.io/projected/8b5bec9e-6060-43e8-954c-ee6153e9a841-kube-api-access-cfdhb\") pod \"8b5bec9e-6060-43e8-954c-ee6153e9a841\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.953414 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-combined-ca-bundle\") pod \"8b5bec9e-6060-43e8-954c-ee6153e9a841\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.953439 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b5bec9e-6060-43e8-954c-ee6153e9a841-log-httpd\") pod \"8b5bec9e-6060-43e8-954c-ee6153e9a841\" (UID: 
\"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.953472 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b5bec9e-6060-43e8-954c-ee6153e9a841-run-httpd\") pod \"8b5bec9e-6060-43e8-954c-ee6153e9a841\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.953501 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-sg-core-conf-yaml\") pod \"8b5bec9e-6060-43e8-954c-ee6153e9a841\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.953553 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-scripts\") pod \"8b5bec9e-6060-43e8-954c-ee6153e9a841\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.953674 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-config-data\") pod \"8b5bec9e-6060-43e8-954c-ee6153e9a841\" (UID: \"8b5bec9e-6060-43e8-954c-ee6153e9a841\") " Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.954225 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b5bec9e-6060-43e8-954c-ee6153e9a841-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8b5bec9e-6060-43e8-954c-ee6153e9a841" (UID: "8b5bec9e-6060-43e8-954c-ee6153e9a841"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.954989 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b5bec9e-6060-43e8-954c-ee6153e9a841-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8b5bec9e-6060-43e8-954c-ee6153e9a841" (UID: "8b5bec9e-6060-43e8-954c-ee6153e9a841"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.955609 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b5bec9e-6060-43e8-954c-ee6153e9a841-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.956239 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b5bec9e-6060-43e8-954c-ee6153e9a841-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.960429 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-scripts" (OuterVolumeSpecName: "scripts") pod "8b5bec9e-6060-43e8-954c-ee6153e9a841" (UID: "8b5bec9e-6060-43e8-954c-ee6153e9a841"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:54 crc kubenswrapper[4898]: I1211 13:29:54.960665 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b5bec9e-6060-43e8-954c-ee6153e9a841-kube-api-access-cfdhb" (OuterVolumeSpecName: "kube-api-access-cfdhb") pod "8b5bec9e-6060-43e8-954c-ee6153e9a841" (UID: "8b5bec9e-6060-43e8-954c-ee6153e9a841"). InnerVolumeSpecName "kube-api-access-cfdhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.006783 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8b5bec9e-6060-43e8-954c-ee6153e9a841" (UID: "8b5bec9e-6060-43e8-954c-ee6153e9a841"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.047889 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b5bec9e-6060-43e8-954c-ee6153e9a841" (UID: "8b5bec9e-6060-43e8-954c-ee6153e9a841"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.058809 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfdhb\" (UniqueName: \"kubernetes.io/projected/8b5bec9e-6060-43e8-954c-ee6153e9a841-kube-api-access-cfdhb\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.058848 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.058858 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.058867 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-scripts\") on 
node \"crc\" DevicePath \"\"" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.069903 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-config-data" (OuterVolumeSpecName: "config-data") pod "8b5bec9e-6060-43e8-954c-ee6153e9a841" (UID: "8b5bec9e-6060-43e8-954c-ee6153e9a841"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.106611 4898 generic.go:334] "Generic (PLEG): container finished" podID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerID="e5b8edb8ea2ed08143ee09684135eca75584ff72dc887d2c5fb4a80e271996bd" exitCode=0 Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.106668 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b5bec9e-6060-43e8-954c-ee6153e9a841","Type":"ContainerDied","Data":"e5b8edb8ea2ed08143ee09684135eca75584ff72dc887d2c5fb4a80e271996bd"} Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.106719 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.106744 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b5bec9e-6060-43e8-954c-ee6153e9a841","Type":"ContainerDied","Data":"de5a3b49ff8ee96bd283690d339f25f8998190192993318f124d24c990388b50"} Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.106769 4898 scope.go:117] "RemoveContainer" containerID="ae5f57e85ad7814ab6cd8ae2cf85c905997f161db6a3cf68751505a0d0f0865c" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.153250 4898 scope.go:117] "RemoveContainer" containerID="12351c1f1f32a9581dfa326b208d96bd48915b87a3c8fefd0ebc8992142c55c5" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.161283 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b5bec9e-6060-43e8-954c-ee6153e9a841-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.201894 4898 scope.go:117] "RemoveContainer" containerID="e5b8edb8ea2ed08143ee09684135eca75584ff72dc887d2c5fb4a80e271996bd" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.213159 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.225291 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.236311 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:55 crc kubenswrapper[4898]: E1211 13:29:55.236891 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerName="ceilometer-central-agent" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.236907 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" 
containerName="ceilometer-central-agent" Dec 11 13:29:55 crc kubenswrapper[4898]: E1211 13:29:55.236927 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerName="sg-core" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.236935 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerName="sg-core" Dec 11 13:29:55 crc kubenswrapper[4898]: E1211 13:29:55.236984 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerName="ceilometer-notification-agent" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.236991 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerName="ceilometer-notification-agent" Dec 11 13:29:55 crc kubenswrapper[4898]: E1211 13:29:55.237003 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerName="proxy-httpd" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.237008 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerName="proxy-httpd" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.237235 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerName="sg-core" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.237251 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerName="ceilometer-notification-agent" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.237274 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerName="proxy-httpd" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.237287 4898 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" containerName="ceilometer-central-agent" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.239363 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.244001 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.244245 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.244737 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.247651 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.259981 4898 scope.go:117] "RemoveContainer" containerID="0ff71a84e971432d65e47f89bc14dab420c94ec9d0695ce6e52bb57f24dd1027" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.289055 4898 scope.go:117] "RemoveContainer" containerID="ae5f57e85ad7814ab6cd8ae2cf85c905997f161db6a3cf68751505a0d0f0865c" Dec 11 13:29:55 crc kubenswrapper[4898]: E1211 13:29:55.289703 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae5f57e85ad7814ab6cd8ae2cf85c905997f161db6a3cf68751505a0d0f0865c\": container with ID starting with ae5f57e85ad7814ab6cd8ae2cf85c905997f161db6a3cf68751505a0d0f0865c not found: ID does not exist" containerID="ae5f57e85ad7814ab6cd8ae2cf85c905997f161db6a3cf68751505a0d0f0865c" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.289752 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae5f57e85ad7814ab6cd8ae2cf85c905997f161db6a3cf68751505a0d0f0865c"} err="failed to get container status 
\"ae5f57e85ad7814ab6cd8ae2cf85c905997f161db6a3cf68751505a0d0f0865c\": rpc error: code = NotFound desc = could not find container \"ae5f57e85ad7814ab6cd8ae2cf85c905997f161db6a3cf68751505a0d0f0865c\": container with ID starting with ae5f57e85ad7814ab6cd8ae2cf85c905997f161db6a3cf68751505a0d0f0865c not found: ID does not exist" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.289787 4898 scope.go:117] "RemoveContainer" containerID="12351c1f1f32a9581dfa326b208d96bd48915b87a3c8fefd0ebc8992142c55c5" Dec 11 13:29:55 crc kubenswrapper[4898]: E1211 13:29:55.290120 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12351c1f1f32a9581dfa326b208d96bd48915b87a3c8fefd0ebc8992142c55c5\": container with ID starting with 12351c1f1f32a9581dfa326b208d96bd48915b87a3c8fefd0ebc8992142c55c5 not found: ID does not exist" containerID="12351c1f1f32a9581dfa326b208d96bd48915b87a3c8fefd0ebc8992142c55c5" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.290153 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12351c1f1f32a9581dfa326b208d96bd48915b87a3c8fefd0ebc8992142c55c5"} err="failed to get container status \"12351c1f1f32a9581dfa326b208d96bd48915b87a3c8fefd0ebc8992142c55c5\": rpc error: code = NotFound desc = could not find container \"12351c1f1f32a9581dfa326b208d96bd48915b87a3c8fefd0ebc8992142c55c5\": container with ID starting with 12351c1f1f32a9581dfa326b208d96bd48915b87a3c8fefd0ebc8992142c55c5 not found: ID does not exist" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.290176 4898 scope.go:117] "RemoveContainer" containerID="e5b8edb8ea2ed08143ee09684135eca75584ff72dc887d2c5fb4a80e271996bd" Dec 11 13:29:55 crc kubenswrapper[4898]: E1211 13:29:55.290635 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e5b8edb8ea2ed08143ee09684135eca75584ff72dc887d2c5fb4a80e271996bd\": container with ID starting with e5b8edb8ea2ed08143ee09684135eca75584ff72dc887d2c5fb4a80e271996bd not found: ID does not exist" containerID="e5b8edb8ea2ed08143ee09684135eca75584ff72dc887d2c5fb4a80e271996bd" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.290663 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5b8edb8ea2ed08143ee09684135eca75584ff72dc887d2c5fb4a80e271996bd"} err="failed to get container status \"e5b8edb8ea2ed08143ee09684135eca75584ff72dc887d2c5fb4a80e271996bd\": rpc error: code = NotFound desc = could not find container \"e5b8edb8ea2ed08143ee09684135eca75584ff72dc887d2c5fb4a80e271996bd\": container with ID starting with e5b8edb8ea2ed08143ee09684135eca75584ff72dc887d2c5fb4a80e271996bd not found: ID does not exist" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.290682 4898 scope.go:117] "RemoveContainer" containerID="0ff71a84e971432d65e47f89bc14dab420c94ec9d0695ce6e52bb57f24dd1027" Dec 11 13:29:55 crc kubenswrapper[4898]: E1211 13:29:55.290940 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ff71a84e971432d65e47f89bc14dab420c94ec9d0695ce6e52bb57f24dd1027\": container with ID starting with 0ff71a84e971432d65e47f89bc14dab420c94ec9d0695ce6e52bb57f24dd1027 not found: ID does not exist" containerID="0ff71a84e971432d65e47f89bc14dab420c94ec9d0695ce6e52bb57f24dd1027" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.290976 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ff71a84e971432d65e47f89bc14dab420c94ec9d0695ce6e52bb57f24dd1027"} err="failed to get container status \"0ff71a84e971432d65e47f89bc14dab420c94ec9d0695ce6e52bb57f24dd1027\": rpc error: code = NotFound desc = could not find container \"0ff71a84e971432d65e47f89bc14dab420c94ec9d0695ce6e52bb57f24dd1027\": container with ID 
starting with 0ff71a84e971432d65e47f89bc14dab420c94ec9d0695ce6e52bb57f24dd1027 not found: ID does not exist" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.369824 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/378696c4-d1ad-4165-b792-11c37b390c35-run-httpd\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.369936 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/378696c4-d1ad-4165-b792-11c37b390c35-log-httpd\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.370092 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.370120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-scripts\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.370304 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wjhg\" (UniqueName: \"kubernetes.io/projected/378696c4-d1ad-4165-b792-11c37b390c35-kube-api-access-2wjhg\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc 
kubenswrapper[4898]: I1211 13:29:55.370516 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-config-data\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.370540 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.370663 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.472609 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-scripts\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.472733 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wjhg\" (UniqueName: \"kubernetes.io/projected/378696c4-d1ad-4165-b792-11c37b390c35-kube-api-access-2wjhg\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.472846 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-config-data\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.472878 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.472946 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.472975 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/378696c4-d1ad-4165-b792-11c37b390c35-run-httpd\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.473065 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/378696c4-d1ad-4165-b792-11c37b390c35-log-httpd\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.473172 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: 
I1211 13:29:55.473718 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/378696c4-d1ad-4165-b792-11c37b390c35-run-httpd\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.473917 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/378696c4-d1ad-4165-b792-11c37b390c35-log-httpd\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.477861 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-config-data\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.478276 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-scripts\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.486024 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.486166 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " 
pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.486878 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.491382 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wjhg\" (UniqueName: \"kubernetes.io/projected/378696c4-d1ad-4165-b792-11c37b390c35-kube-api-access-2wjhg\") pod \"ceilometer-0\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " pod="openstack/ceilometer-0" Dec 11 13:29:55 crc kubenswrapper[4898]: I1211 13:29:55.576199 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:29:56 crc kubenswrapper[4898]: I1211 13:29:56.052051 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:29:56 crc kubenswrapper[4898]: W1211 13:29:56.059162 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod378696c4_d1ad_4165_b792_11c37b390c35.slice/crio-a60cc6ff3e24087b2a1b9c29cf93371ee822dec5099411d9be0b16ed8caaba47 WatchSource:0}: Error finding container a60cc6ff3e24087b2a1b9c29cf93371ee822dec5099411d9be0b16ed8caaba47: Status 404 returned error can't find the container with id a60cc6ff3e24087b2a1b9c29cf93371ee822dec5099411d9be0b16ed8caaba47 Dec 11 13:29:56 crc kubenswrapper[4898]: I1211 13:29:56.122265 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"378696c4-d1ad-4165-b792-11c37b390c35","Type":"ContainerStarted","Data":"a60cc6ff3e24087b2a1b9c29cf93371ee822dec5099411d9be0b16ed8caaba47"} Dec 11 13:29:56 crc kubenswrapper[4898]: I1211 13:29:56.805721 4898 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="8b5bec9e-6060-43e8-954c-ee6153e9a841" path="/var/lib/kubelet/pods/8b5bec9e-6060-43e8-954c-ee6153e9a841/volumes" Dec 11 13:29:57 crc kubenswrapper[4898]: I1211 13:29:57.133494 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"378696c4-d1ad-4165-b792-11c37b390c35","Type":"ContainerStarted","Data":"4d9ee735b7d576f6f707efd923ccb25d7fa330c6bacb8416fee94406c0ce8d48"} Dec 11 13:29:58 crc kubenswrapper[4898]: I1211 13:29:58.145401 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"378696c4-d1ad-4165-b792-11c37b390c35","Type":"ContainerStarted","Data":"08b26dd5ecdcb4fa98fd969c8a86c3fbf902869d2c52ae197f13633d4a93fb74"} Dec 11 13:29:59 crc kubenswrapper[4898]: I1211 13:29:59.157180 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"378696c4-d1ad-4165-b792-11c37b390c35","Type":"ContainerStarted","Data":"1060df277c29d530022ccd8aa959c7f49b5d54acbec4b4b623abacb026babed4"} Dec 11 13:29:59 crc kubenswrapper[4898]: I1211 13:29:59.345401 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d8hls"] Dec 11 13:29:59 crc kubenswrapper[4898]: I1211 13:29:59.349491 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:29:59 crc kubenswrapper[4898]: I1211 13:29:59.362593 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d8hls"] Dec 11 13:29:59 crc kubenswrapper[4898]: I1211 13:29:59.487937 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e123d3d-07a5-4968-b891-02bf1e03020f-utilities\") pod \"community-operators-d8hls\" (UID: \"0e123d3d-07a5-4968-b891-02bf1e03020f\") " pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:29:59 crc kubenswrapper[4898]: I1211 13:29:59.488352 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zxlp\" (UniqueName: \"kubernetes.io/projected/0e123d3d-07a5-4968-b891-02bf1e03020f-kube-api-access-8zxlp\") pod \"community-operators-d8hls\" (UID: \"0e123d3d-07a5-4968-b891-02bf1e03020f\") " pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:29:59 crc kubenswrapper[4898]: I1211 13:29:59.488441 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e123d3d-07a5-4968-b891-02bf1e03020f-catalog-content\") pod \"community-operators-d8hls\" (UID: \"0e123d3d-07a5-4968-b891-02bf1e03020f\") " pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:29:59 crc kubenswrapper[4898]: I1211 13:29:59.591863 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e123d3d-07a5-4968-b891-02bf1e03020f-utilities\") pod \"community-operators-d8hls\" (UID: \"0e123d3d-07a5-4968-b891-02bf1e03020f\") " pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:29:59 crc kubenswrapper[4898]: I1211 13:29:59.591930 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8zxlp\" (UniqueName: \"kubernetes.io/projected/0e123d3d-07a5-4968-b891-02bf1e03020f-kube-api-access-8zxlp\") pod \"community-operators-d8hls\" (UID: \"0e123d3d-07a5-4968-b891-02bf1e03020f\") " pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:29:59 crc kubenswrapper[4898]: I1211 13:29:59.591973 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e123d3d-07a5-4968-b891-02bf1e03020f-catalog-content\") pod \"community-operators-d8hls\" (UID: \"0e123d3d-07a5-4968-b891-02bf1e03020f\") " pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:29:59 crc kubenswrapper[4898]: I1211 13:29:59.592586 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e123d3d-07a5-4968-b891-02bf1e03020f-catalog-content\") pod \"community-operators-d8hls\" (UID: \"0e123d3d-07a5-4968-b891-02bf1e03020f\") " pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:29:59 crc kubenswrapper[4898]: I1211 13:29:59.592807 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e123d3d-07a5-4968-b891-02bf1e03020f-utilities\") pod \"community-operators-d8hls\" (UID: \"0e123d3d-07a5-4968-b891-02bf1e03020f\") " pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:29:59 crc kubenswrapper[4898]: I1211 13:29:59.609674 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zxlp\" (UniqueName: \"kubernetes.io/projected/0e123d3d-07a5-4968-b891-02bf1e03020f-kube-api-access-8zxlp\") pod \"community-operators-d8hls\" (UID: \"0e123d3d-07a5-4968-b891-02bf1e03020f\") " pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:29:59 crc kubenswrapper[4898]: I1211 13:29:59.672308 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.150571 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t"] Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.152769 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.156224 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.156649 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.211370 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t"] Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.249699 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"378696c4-d1ad-4165-b792-11c37b390c35","Type":"ContainerStarted","Data":"5b6e52ddedc83d69d942d29c81a79a8a88df5f31c6828c3fd8e7114f88215fe6"} Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.249994 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.282176 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d8hls"] Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.296212 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.776992581 podStartE2EDuration="5.29619266s" podCreationTimestamp="2025-12-11 13:29:55 +0000 UTC" 
firstStartedPulling="2025-12-11 13:29:56.062848154 +0000 UTC m=+1553.635174591" lastFinishedPulling="2025-12-11 13:29:59.582048233 +0000 UTC m=+1557.154374670" observedRunningTime="2025-12-11 13:30:00.273933407 +0000 UTC m=+1557.846259864" watchObservedRunningTime="2025-12-11 13:30:00.29619266 +0000 UTC m=+1557.868519097" Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.322681 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f590609a-e2e1-4add-97e0-9d08b6a2c723-secret-volume\") pod \"collect-profiles-29424330-fzl2t\" (UID: \"f590609a-e2e1-4add-97e0-9d08b6a2c723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.322772 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpn2g\" (UniqueName: \"kubernetes.io/projected/f590609a-e2e1-4add-97e0-9d08b6a2c723-kube-api-access-mpn2g\") pod \"collect-profiles-29424330-fzl2t\" (UID: \"f590609a-e2e1-4add-97e0-9d08b6a2c723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.322795 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f590609a-e2e1-4add-97e0-9d08b6a2c723-config-volume\") pod \"collect-profiles-29424330-fzl2t\" (UID: \"f590609a-e2e1-4add-97e0-9d08b6a2c723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.424927 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f590609a-e2e1-4add-97e0-9d08b6a2c723-secret-volume\") pod \"collect-profiles-29424330-fzl2t\" (UID: \"f590609a-e2e1-4add-97e0-9d08b6a2c723\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.425029 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpn2g\" (UniqueName: \"kubernetes.io/projected/f590609a-e2e1-4add-97e0-9d08b6a2c723-kube-api-access-mpn2g\") pod \"collect-profiles-29424330-fzl2t\" (UID: \"f590609a-e2e1-4add-97e0-9d08b6a2c723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.425068 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f590609a-e2e1-4add-97e0-9d08b6a2c723-config-volume\") pod \"collect-profiles-29424330-fzl2t\" (UID: \"f590609a-e2e1-4add-97e0-9d08b6a2c723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.425858 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f590609a-e2e1-4add-97e0-9d08b6a2c723-config-volume\") pod \"collect-profiles-29424330-fzl2t\" (UID: \"f590609a-e2e1-4add-97e0-9d08b6a2c723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.432547 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f590609a-e2e1-4add-97e0-9d08b6a2c723-secret-volume\") pod \"collect-profiles-29424330-fzl2t\" (UID: \"f590609a-e2e1-4add-97e0-9d08b6a2c723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.453006 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpn2g\" (UniqueName: 
\"kubernetes.io/projected/f590609a-e2e1-4add-97e0-9d08b6a2c723-kube-api-access-mpn2g\") pod \"collect-profiles-29424330-fzl2t\" (UID: \"f590609a-e2e1-4add-97e0-9d08b6a2c723\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" Dec 11 13:30:00 crc kubenswrapper[4898]: I1211 13:30:00.486075 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" Dec 11 13:30:01 crc kubenswrapper[4898]: W1211 13:30:01.023862 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf590609a_e2e1_4add_97e0_9d08b6a2c723.slice/crio-98cae023f596c72c42a4329ddb56a9389de2be345bddf2562e0a69c8d5444db4 WatchSource:0}: Error finding container 98cae023f596c72c42a4329ddb56a9389de2be345bddf2562e0a69c8d5444db4: Status 404 returned error can't find the container with id 98cae023f596c72c42a4329ddb56a9389de2be345bddf2562e0a69c8d5444db4 Dec 11 13:30:01 crc kubenswrapper[4898]: I1211 13:30:01.029084 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t"] Dec 11 13:30:01 crc kubenswrapper[4898]: I1211 13:30:01.285495 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" event={"ID":"f590609a-e2e1-4add-97e0-9d08b6a2c723","Type":"ContainerStarted","Data":"98cae023f596c72c42a4329ddb56a9389de2be345bddf2562e0a69c8d5444db4"} Dec 11 13:30:01 crc kubenswrapper[4898]: I1211 13:30:01.290572 4898 generic.go:334] "Generic (PLEG): container finished" podID="0e123d3d-07a5-4968-b891-02bf1e03020f" containerID="dfc91ffa4eabbe50328e6bd6bb5200a42d53496027b808caefe6cedf95409299" exitCode=0 Dec 11 13:30:01 crc kubenswrapper[4898]: I1211 13:30:01.291999 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8hls" 
event={"ID":"0e123d3d-07a5-4968-b891-02bf1e03020f","Type":"ContainerDied","Data":"dfc91ffa4eabbe50328e6bd6bb5200a42d53496027b808caefe6cedf95409299"} Dec 11 13:30:01 crc kubenswrapper[4898]: I1211 13:30:01.292041 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8hls" event={"ID":"0e123d3d-07a5-4968-b891-02bf1e03020f","Type":"ContainerStarted","Data":"30a75a42e9b7ca5b0380403fbf30faccc692fc22e490c0a9a59fcd78633c3e06"} Dec 11 13:30:01 crc kubenswrapper[4898]: I1211 13:30:01.451077 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 11 13:30:02 crc kubenswrapper[4898]: I1211 13:30:02.305409 4898 generic.go:334] "Generic (PLEG): container finished" podID="f590609a-e2e1-4add-97e0-9d08b6a2c723" containerID="e80ed6a6aac5e97971b98ea3a79f0be65d4937042cd9fa25691caa735cae5157" exitCode=0 Dec 11 13:30:02 crc kubenswrapper[4898]: I1211 13:30:02.305474 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" event={"ID":"f590609a-e2e1-4add-97e0-9d08b6a2c723","Type":"ContainerDied","Data":"e80ed6a6aac5e97971b98ea3a79f0be65d4937042cd9fa25691caa735cae5157"} Dec 11 13:30:03 crc kubenswrapper[4898]: I1211 13:30:03.338797 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8hls" event={"ID":"0e123d3d-07a5-4968-b891-02bf1e03020f","Type":"ContainerStarted","Data":"5649a8389cca108a08f18b8312b60b3f071f333fe1841f2cb946f708418305be"} Dec 11 13:30:04 crc kubenswrapper[4898]: I1211 13:30:04.111230 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" Dec 11 13:30:04 crc kubenswrapper[4898]: I1211 13:30:04.241794 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f590609a-e2e1-4add-97e0-9d08b6a2c723-secret-volume\") pod \"f590609a-e2e1-4add-97e0-9d08b6a2c723\" (UID: \"f590609a-e2e1-4add-97e0-9d08b6a2c723\") " Dec 11 13:30:04 crc kubenswrapper[4898]: I1211 13:30:04.241915 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpn2g\" (UniqueName: \"kubernetes.io/projected/f590609a-e2e1-4add-97e0-9d08b6a2c723-kube-api-access-mpn2g\") pod \"f590609a-e2e1-4add-97e0-9d08b6a2c723\" (UID: \"f590609a-e2e1-4add-97e0-9d08b6a2c723\") " Dec 11 13:30:04 crc kubenswrapper[4898]: I1211 13:30:04.241969 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f590609a-e2e1-4add-97e0-9d08b6a2c723-config-volume\") pod \"f590609a-e2e1-4add-97e0-9d08b6a2c723\" (UID: \"f590609a-e2e1-4add-97e0-9d08b6a2c723\") " Dec 11 13:30:04 crc kubenswrapper[4898]: I1211 13:30:04.242531 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f590609a-e2e1-4add-97e0-9d08b6a2c723-config-volume" (OuterVolumeSpecName: "config-volume") pod "f590609a-e2e1-4add-97e0-9d08b6a2c723" (UID: "f590609a-e2e1-4add-97e0-9d08b6a2c723"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:30:04 crc kubenswrapper[4898]: I1211 13:30:04.242875 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f590609a-e2e1-4add-97e0-9d08b6a2c723-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 13:30:04 crc kubenswrapper[4898]: I1211 13:30:04.248093 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f590609a-e2e1-4add-97e0-9d08b6a2c723-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f590609a-e2e1-4add-97e0-9d08b6a2c723" (UID: "f590609a-e2e1-4add-97e0-9d08b6a2c723"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:30:04 crc kubenswrapper[4898]: I1211 13:30:04.248388 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f590609a-e2e1-4add-97e0-9d08b6a2c723-kube-api-access-mpn2g" (OuterVolumeSpecName: "kube-api-access-mpn2g") pod "f590609a-e2e1-4add-97e0-9d08b6a2c723" (UID: "f590609a-e2e1-4add-97e0-9d08b6a2c723"). InnerVolumeSpecName "kube-api-access-mpn2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:30:04 crc kubenswrapper[4898]: I1211 13:30:04.345538 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f590609a-e2e1-4add-97e0-9d08b6a2c723-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 11 13:30:04 crc kubenswrapper[4898]: I1211 13:30:04.345585 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpn2g\" (UniqueName: \"kubernetes.io/projected/f590609a-e2e1-4add-97e0-9d08b6a2c723-kube-api-access-mpn2g\") on node \"crc\" DevicePath \"\"" Dec 11 13:30:04 crc kubenswrapper[4898]: I1211 13:30:04.356562 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" event={"ID":"f590609a-e2e1-4add-97e0-9d08b6a2c723","Type":"ContainerDied","Data":"98cae023f596c72c42a4329ddb56a9389de2be345bddf2562e0a69c8d5444db4"} Dec 11 13:30:04 crc kubenswrapper[4898]: I1211 13:30:04.356600 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98cae023f596c72c42a4329ddb56a9389de2be345bddf2562e0a69c8d5444db4" Dec 11 13:30:04 crc kubenswrapper[4898]: I1211 13:30:04.356623 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t" Dec 11 13:30:04 crc kubenswrapper[4898]: I1211 13:30:04.381007 4898 generic.go:334] "Generic (PLEG): container finished" podID="0e123d3d-07a5-4968-b891-02bf1e03020f" containerID="5649a8389cca108a08f18b8312b60b3f071f333fe1841f2cb946f708418305be" exitCode=0 Dec 11 13:30:04 crc kubenswrapper[4898]: I1211 13:30:04.381064 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8hls" event={"ID":"0e123d3d-07a5-4968-b891-02bf1e03020f","Type":"ContainerDied","Data":"5649a8389cca108a08f18b8312b60b3f071f333fe1841f2cb946f708418305be"} Dec 11 13:30:06 crc kubenswrapper[4898]: I1211 13:30:06.416474 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8hls" event={"ID":"0e123d3d-07a5-4968-b891-02bf1e03020f","Type":"ContainerStarted","Data":"d18ec97539f287eb43ffce5e11d0017f4e49f8c8f4839db7bd5494010ed2017e"} Dec 11 13:30:06 crc kubenswrapper[4898]: I1211 13:30:06.440238 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d8hls" podStartSLOduration=3.409756799 podStartE2EDuration="7.440218634s" podCreationTimestamp="2025-12-11 13:29:59 +0000 UTC" firstStartedPulling="2025-12-11 13:30:01.294687782 +0000 UTC m=+1558.867014219" lastFinishedPulling="2025-12-11 13:30:05.325149617 +0000 UTC m=+1562.897476054" observedRunningTime="2025-12-11 13:30:06.431046955 +0000 UTC m=+1564.003373392" watchObservedRunningTime="2025-12-11 13:30:06.440218634 +0000 UTC m=+1564.012545061" Dec 11 13:30:09 crc kubenswrapper[4898]: I1211 13:30:09.673304 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:30:09 crc kubenswrapper[4898]: I1211 13:30:09.673947 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:30:09 crc kubenswrapper[4898]: I1211 13:30:09.743233 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:30:10 crc kubenswrapper[4898]: I1211 13:30:10.517205 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:30:10 crc kubenswrapper[4898]: I1211 13:30:10.567507 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d8hls"] Dec 11 13:30:12 crc kubenswrapper[4898]: I1211 13:30:12.478709 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d8hls" podUID="0e123d3d-07a5-4968-b891-02bf1e03020f" containerName="registry-server" containerID="cri-o://d18ec97539f287eb43ffce5e11d0017f4e49f8c8f4839db7bd5494010ed2017e" gracePeriod=2 Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.028904 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.142817 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zxlp\" (UniqueName: \"kubernetes.io/projected/0e123d3d-07a5-4968-b891-02bf1e03020f-kube-api-access-8zxlp\") pod \"0e123d3d-07a5-4968-b891-02bf1e03020f\" (UID: \"0e123d3d-07a5-4968-b891-02bf1e03020f\") " Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.143265 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e123d3d-07a5-4968-b891-02bf1e03020f-catalog-content\") pod \"0e123d3d-07a5-4968-b891-02bf1e03020f\" (UID: \"0e123d3d-07a5-4968-b891-02bf1e03020f\") " Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.143524 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e123d3d-07a5-4968-b891-02bf1e03020f-utilities\") pod \"0e123d3d-07a5-4968-b891-02bf1e03020f\" (UID: \"0e123d3d-07a5-4968-b891-02bf1e03020f\") " Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.144368 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e123d3d-07a5-4968-b891-02bf1e03020f-utilities" (OuterVolumeSpecName: "utilities") pod "0e123d3d-07a5-4968-b891-02bf1e03020f" (UID: "0e123d3d-07a5-4968-b891-02bf1e03020f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.148474 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e123d3d-07a5-4968-b891-02bf1e03020f-kube-api-access-8zxlp" (OuterVolumeSpecName: "kube-api-access-8zxlp") pod "0e123d3d-07a5-4968-b891-02bf1e03020f" (UID: "0e123d3d-07a5-4968-b891-02bf1e03020f"). InnerVolumeSpecName "kube-api-access-8zxlp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.209861 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e123d3d-07a5-4968-b891-02bf1e03020f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e123d3d-07a5-4968-b891-02bf1e03020f" (UID: "0e123d3d-07a5-4968-b891-02bf1e03020f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.245936 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e123d3d-07a5-4968-b891-02bf1e03020f-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.245973 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zxlp\" (UniqueName: \"kubernetes.io/projected/0e123d3d-07a5-4968-b891-02bf1e03020f-kube-api-access-8zxlp\") on node \"crc\" DevicePath \"\"" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.245985 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e123d3d-07a5-4968-b891-02bf1e03020f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.492060 4898 generic.go:334] "Generic (PLEG): container finished" podID="0e123d3d-07a5-4968-b891-02bf1e03020f" containerID="d18ec97539f287eb43ffce5e11d0017f4e49f8c8f4839db7bd5494010ed2017e" exitCode=0 Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.492103 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8hls" event={"ID":"0e123d3d-07a5-4968-b891-02bf1e03020f","Type":"ContainerDied","Data":"d18ec97539f287eb43ffce5e11d0017f4e49f8c8f4839db7bd5494010ed2017e"} Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.492134 4898 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-d8hls" event={"ID":"0e123d3d-07a5-4968-b891-02bf1e03020f","Type":"ContainerDied","Data":"30a75a42e9b7ca5b0380403fbf30faccc692fc22e490c0a9a59fcd78633c3e06"} Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.492157 4898 scope.go:117] "RemoveContainer" containerID="d18ec97539f287eb43ffce5e11d0017f4e49f8c8f4839db7bd5494010ed2017e" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.492324 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d8hls" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.519195 4898 scope.go:117] "RemoveContainer" containerID="5649a8389cca108a08f18b8312b60b3f071f333fe1841f2cb946f708418305be" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.544864 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d8hls"] Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.560346 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d8hls"] Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.562161 4898 scope.go:117] "RemoveContainer" containerID="dfc91ffa4eabbe50328e6bd6bb5200a42d53496027b808caefe6cedf95409299" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.619970 4898 scope.go:117] "RemoveContainer" containerID="d18ec97539f287eb43ffce5e11d0017f4e49f8c8f4839db7bd5494010ed2017e" Dec 11 13:30:13 crc kubenswrapper[4898]: E1211 13:30:13.622736 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18ec97539f287eb43ffce5e11d0017f4e49f8c8f4839db7bd5494010ed2017e\": container with ID starting with d18ec97539f287eb43ffce5e11d0017f4e49f8c8f4839db7bd5494010ed2017e not found: ID does not exist" containerID="d18ec97539f287eb43ffce5e11d0017f4e49f8c8f4839db7bd5494010ed2017e" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 
13:30:13.622863 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18ec97539f287eb43ffce5e11d0017f4e49f8c8f4839db7bd5494010ed2017e"} err="failed to get container status \"d18ec97539f287eb43ffce5e11d0017f4e49f8c8f4839db7bd5494010ed2017e\": rpc error: code = NotFound desc = could not find container \"d18ec97539f287eb43ffce5e11d0017f4e49f8c8f4839db7bd5494010ed2017e\": container with ID starting with d18ec97539f287eb43ffce5e11d0017f4e49f8c8f4839db7bd5494010ed2017e not found: ID does not exist" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.622978 4898 scope.go:117] "RemoveContainer" containerID="5649a8389cca108a08f18b8312b60b3f071f333fe1841f2cb946f708418305be" Dec 11 13:30:13 crc kubenswrapper[4898]: E1211 13:30:13.623434 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5649a8389cca108a08f18b8312b60b3f071f333fe1841f2cb946f708418305be\": container with ID starting with 5649a8389cca108a08f18b8312b60b3f071f333fe1841f2cb946f708418305be not found: ID does not exist" containerID="5649a8389cca108a08f18b8312b60b3f071f333fe1841f2cb946f708418305be" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.623569 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5649a8389cca108a08f18b8312b60b3f071f333fe1841f2cb946f708418305be"} err="failed to get container status \"5649a8389cca108a08f18b8312b60b3f071f333fe1841f2cb946f708418305be\": rpc error: code = NotFound desc = could not find container \"5649a8389cca108a08f18b8312b60b3f071f333fe1841f2cb946f708418305be\": container with ID starting with 5649a8389cca108a08f18b8312b60b3f071f333fe1841f2cb946f708418305be not found: ID does not exist" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.623658 4898 scope.go:117] "RemoveContainer" containerID="dfc91ffa4eabbe50328e6bd6bb5200a42d53496027b808caefe6cedf95409299" Dec 11 13:30:13 crc 
kubenswrapper[4898]: E1211 13:30:13.625758 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfc91ffa4eabbe50328e6bd6bb5200a42d53496027b808caefe6cedf95409299\": container with ID starting with dfc91ffa4eabbe50328e6bd6bb5200a42d53496027b808caefe6cedf95409299 not found: ID does not exist" containerID="dfc91ffa4eabbe50328e6bd6bb5200a42d53496027b808caefe6cedf95409299" Dec 11 13:30:13 crc kubenswrapper[4898]: I1211 13:30:13.625872 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfc91ffa4eabbe50328e6bd6bb5200a42d53496027b808caefe6cedf95409299"} err="failed to get container status \"dfc91ffa4eabbe50328e6bd6bb5200a42d53496027b808caefe6cedf95409299\": rpc error: code = NotFound desc = could not find container \"dfc91ffa4eabbe50328e6bd6bb5200a42d53496027b808caefe6cedf95409299\": container with ID starting with dfc91ffa4eabbe50328e6bd6bb5200a42d53496027b808caefe6cedf95409299 not found: ID does not exist" Dec 11 13:30:14 crc kubenswrapper[4898]: I1211 13:30:14.797196 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e123d3d-07a5-4968-b891-02bf1e03020f" path="/var/lib/kubelet/pods/0e123d3d-07a5-4968-b891-02bf1e03020f/volumes" Dec 11 13:30:25 crc kubenswrapper[4898]: I1211 13:30:25.592930 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.452354 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-mvzhc"] Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.464167 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-mvzhc"] Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.539888 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-j2jpr"] Dec 11 13:30:37 crc kubenswrapper[4898]: E1211 13:30:37.540566 4898 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e123d3d-07a5-4968-b891-02bf1e03020f" containerName="extract-utilities" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.540592 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e123d3d-07a5-4968-b891-02bf1e03020f" containerName="extract-utilities" Dec 11 13:30:37 crc kubenswrapper[4898]: E1211 13:30:37.540615 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e123d3d-07a5-4968-b891-02bf1e03020f" containerName="extract-content" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.540625 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e123d3d-07a5-4968-b891-02bf1e03020f" containerName="extract-content" Dec 11 13:30:37 crc kubenswrapper[4898]: E1211 13:30:37.540668 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f590609a-e2e1-4add-97e0-9d08b6a2c723" containerName="collect-profiles" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.540677 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f590609a-e2e1-4add-97e0-9d08b6a2c723" containerName="collect-profiles" Dec 11 13:30:37 crc kubenswrapper[4898]: E1211 13:30:37.540700 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e123d3d-07a5-4968-b891-02bf1e03020f" containerName="registry-server" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.540709 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e123d3d-07a5-4968-b891-02bf1e03020f" containerName="registry-server" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.540962 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e123d3d-07a5-4968-b891-02bf1e03020f" containerName="registry-server" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.541003 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f590609a-e2e1-4add-97e0-9d08b6a2c723" containerName="collect-profiles" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.541975 
4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-j2jpr" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.552965 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-j2jpr"] Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.647661 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77bd783a-a06c-400b-b29e-a6edb7d613b3-config-data\") pod \"heat-db-sync-j2jpr\" (UID: \"77bd783a-a06c-400b-b29e-a6edb7d613b3\") " pod="openstack/heat-db-sync-j2jpr" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.647938 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bd783a-a06c-400b-b29e-a6edb7d613b3-combined-ca-bundle\") pod \"heat-db-sync-j2jpr\" (UID: \"77bd783a-a06c-400b-b29e-a6edb7d613b3\") " pod="openstack/heat-db-sync-j2jpr" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.648290 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7bbk\" (UniqueName: \"kubernetes.io/projected/77bd783a-a06c-400b-b29e-a6edb7d613b3-kube-api-access-n7bbk\") pod \"heat-db-sync-j2jpr\" (UID: \"77bd783a-a06c-400b-b29e-a6edb7d613b3\") " pod="openstack/heat-db-sync-j2jpr" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.750360 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7bbk\" (UniqueName: \"kubernetes.io/projected/77bd783a-a06c-400b-b29e-a6edb7d613b3-kube-api-access-n7bbk\") pod \"heat-db-sync-j2jpr\" (UID: \"77bd783a-a06c-400b-b29e-a6edb7d613b3\") " pod="openstack/heat-db-sync-j2jpr" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.750434 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/77bd783a-a06c-400b-b29e-a6edb7d613b3-config-data\") pod \"heat-db-sync-j2jpr\" (UID: \"77bd783a-a06c-400b-b29e-a6edb7d613b3\") " pod="openstack/heat-db-sync-j2jpr" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.750474 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bd783a-a06c-400b-b29e-a6edb7d613b3-combined-ca-bundle\") pod \"heat-db-sync-j2jpr\" (UID: \"77bd783a-a06c-400b-b29e-a6edb7d613b3\") " pod="openstack/heat-db-sync-j2jpr" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.756290 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77bd783a-a06c-400b-b29e-a6edb7d613b3-config-data\") pod \"heat-db-sync-j2jpr\" (UID: \"77bd783a-a06c-400b-b29e-a6edb7d613b3\") " pod="openstack/heat-db-sync-j2jpr" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.756584 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bd783a-a06c-400b-b29e-a6edb7d613b3-combined-ca-bundle\") pod \"heat-db-sync-j2jpr\" (UID: \"77bd783a-a06c-400b-b29e-a6edb7d613b3\") " pod="openstack/heat-db-sync-j2jpr" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.773728 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7bbk\" (UniqueName: \"kubernetes.io/projected/77bd783a-a06c-400b-b29e-a6edb7d613b3-kube-api-access-n7bbk\") pod \"heat-db-sync-j2jpr\" (UID: \"77bd783a-a06c-400b-b29e-a6edb7d613b3\") " pod="openstack/heat-db-sync-j2jpr" Dec 11 13:30:37 crc kubenswrapper[4898]: I1211 13:30:37.915963 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-j2jpr" Dec 11 13:30:38 crc kubenswrapper[4898]: I1211 13:30:38.425361 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 13:30:38 crc kubenswrapper[4898]: I1211 13:30:38.449046 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-j2jpr"] Dec 11 13:30:38 crc kubenswrapper[4898]: I1211 13:30:38.797785 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23459d62-b558-4f82-a875-311d5fa486e5" path="/var/lib/kubelet/pods/23459d62-b558-4f82-a875-311d5fa486e5/volumes" Dec 11 13:30:38 crc kubenswrapper[4898]: I1211 13:30:38.798956 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-j2jpr" event={"ID":"77bd783a-a06c-400b-b29e-a6edb7d613b3","Type":"ContainerStarted","Data":"648e619f7a82d254b30bda5be17f31d3eb893afc50974aee86dc2fc990df3ea7"} Dec 11 13:30:39 crc kubenswrapper[4898]: I1211 13:30:39.705259 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:30:39 crc kubenswrapper[4898]: I1211 13:30:39.705607 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="378696c4-d1ad-4165-b792-11c37b390c35" containerName="sg-core" containerID="cri-o://1060df277c29d530022ccd8aa959c7f49b5d54acbec4b4b623abacb026babed4" gracePeriod=30 Dec 11 13:30:39 crc kubenswrapper[4898]: I1211 13:30:39.705642 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="378696c4-d1ad-4165-b792-11c37b390c35" containerName="ceilometer-notification-agent" containerID="cri-o://08b26dd5ecdcb4fa98fd969c8a86c3fbf902869d2c52ae197f13633d4a93fb74" gracePeriod=30 Dec 11 13:30:39 crc kubenswrapper[4898]: I1211 13:30:39.705668 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="378696c4-d1ad-4165-b792-11c37b390c35" 
containerName="ceilometer-central-agent" containerID="cri-o://4d9ee735b7d576f6f707efd923ccb25d7fa330c6bacb8416fee94406c0ce8d48" gracePeriod=30 Dec 11 13:30:39 crc kubenswrapper[4898]: I1211 13:30:39.705618 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="378696c4-d1ad-4165-b792-11c37b390c35" containerName="proxy-httpd" containerID="cri-o://5b6e52ddedc83d69d942d29c81a79a8a88df5f31c6828c3fd8e7114f88215fe6" gracePeriod=30 Dec 11 13:30:40 crc kubenswrapper[4898]: I1211 13:30:40.217260 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 11 13:30:40 crc kubenswrapper[4898]: I1211 13:30:40.824684 4898 generic.go:334] "Generic (PLEG): container finished" podID="378696c4-d1ad-4165-b792-11c37b390c35" containerID="5b6e52ddedc83d69d942d29c81a79a8a88df5f31c6828c3fd8e7114f88215fe6" exitCode=0 Dec 11 13:30:40 crc kubenswrapper[4898]: I1211 13:30:40.824929 4898 generic.go:334] "Generic (PLEG): container finished" podID="378696c4-d1ad-4165-b792-11c37b390c35" containerID="1060df277c29d530022ccd8aa959c7f49b5d54acbec4b4b623abacb026babed4" exitCode=2 Dec 11 13:30:40 crc kubenswrapper[4898]: I1211 13:30:40.824938 4898 generic.go:334] "Generic (PLEG): container finished" podID="378696c4-d1ad-4165-b792-11c37b390c35" containerID="4d9ee735b7d576f6f707efd923ccb25d7fa330c6bacb8416fee94406c0ce8d48" exitCode=0 Dec 11 13:30:40 crc kubenswrapper[4898]: I1211 13:30:40.824777 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"378696c4-d1ad-4165-b792-11c37b390c35","Type":"ContainerDied","Data":"5b6e52ddedc83d69d942d29c81a79a8a88df5f31c6828c3fd8e7114f88215fe6"} Dec 11 13:30:40 crc kubenswrapper[4898]: I1211 13:30:40.824973 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"378696c4-d1ad-4165-b792-11c37b390c35","Type":"ContainerDied","Data":"1060df277c29d530022ccd8aa959c7f49b5d54acbec4b4b623abacb026babed4"} 
Dec 11 13:30:40 crc kubenswrapper[4898]: I1211 13:30:40.824989 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"378696c4-d1ad-4165-b792-11c37b390c35","Type":"ContainerDied","Data":"4d9ee735b7d576f6f707efd923ccb25d7fa330c6bacb8416fee94406c0ce8d48"} Dec 11 13:30:41 crc kubenswrapper[4898]: I1211 13:30:41.241843 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 13:30:41 crc kubenswrapper[4898]: I1211 13:30:41.868932 4898 generic.go:334] "Generic (PLEG): container finished" podID="378696c4-d1ad-4165-b792-11c37b390c35" containerID="08b26dd5ecdcb4fa98fd969c8a86c3fbf902869d2c52ae197f13633d4a93fb74" exitCode=0 Dec 11 13:30:41 crc kubenswrapper[4898]: I1211 13:30:41.869385 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"378696c4-d1ad-4165-b792-11c37b390c35","Type":"ContainerDied","Data":"08b26dd5ecdcb4fa98fd969c8a86c3fbf902869d2c52ae197f13633d4a93fb74"} Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.215397 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.365907 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-combined-ca-bundle\") pod \"378696c4-d1ad-4165-b792-11c37b390c35\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.365971 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wjhg\" (UniqueName: \"kubernetes.io/projected/378696c4-d1ad-4165-b792-11c37b390c35-kube-api-access-2wjhg\") pod \"378696c4-d1ad-4165-b792-11c37b390c35\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.366052 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/378696c4-d1ad-4165-b792-11c37b390c35-run-httpd\") pod \"378696c4-d1ad-4165-b792-11c37b390c35\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.366107 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-scripts\") pod \"378696c4-d1ad-4165-b792-11c37b390c35\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.366161 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-sg-core-conf-yaml\") pod \"378696c4-d1ad-4165-b792-11c37b390c35\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.366274 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-ceilometer-tls-certs\") pod \"378696c4-d1ad-4165-b792-11c37b390c35\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.366352 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/378696c4-d1ad-4165-b792-11c37b390c35-log-httpd\") pod \"378696c4-d1ad-4165-b792-11c37b390c35\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.366392 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-config-data\") pod \"378696c4-d1ad-4165-b792-11c37b390c35\" (UID: \"378696c4-d1ad-4165-b792-11c37b390c35\") " Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.366646 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/378696c4-d1ad-4165-b792-11c37b390c35-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "378696c4-d1ad-4165-b792-11c37b390c35" (UID: "378696c4-d1ad-4165-b792-11c37b390c35"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.367443 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/378696c4-d1ad-4165-b792-11c37b390c35-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.367801 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/378696c4-d1ad-4165-b792-11c37b390c35-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "378696c4-d1ad-4165-b792-11c37b390c35" (UID: "378696c4-d1ad-4165-b792-11c37b390c35"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.372736 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/378696c4-d1ad-4165-b792-11c37b390c35-kube-api-access-2wjhg" (OuterVolumeSpecName: "kube-api-access-2wjhg") pod "378696c4-d1ad-4165-b792-11c37b390c35" (UID: "378696c4-d1ad-4165-b792-11c37b390c35"). InnerVolumeSpecName "kube-api-access-2wjhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.377997 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-scripts" (OuterVolumeSpecName: "scripts") pod "378696c4-d1ad-4165-b792-11c37b390c35" (UID: "378696c4-d1ad-4165-b792-11c37b390c35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.410078 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "378696c4-d1ad-4165-b792-11c37b390c35" (UID: "378696c4-d1ad-4165-b792-11c37b390c35"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.451235 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "378696c4-d1ad-4165-b792-11c37b390c35" (UID: "378696c4-d1ad-4165-b792-11c37b390c35"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.470209 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/378696c4-d1ad-4165-b792-11c37b390c35-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.470259 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wjhg\" (UniqueName: \"kubernetes.io/projected/378696c4-d1ad-4165-b792-11c37b390c35-kube-api-access-2wjhg\") on node \"crc\" DevicePath \"\"" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.470271 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.470279 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.470287 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.494757 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "378696c4-d1ad-4165-b792-11c37b390c35" (UID: "378696c4-d1ad-4165-b792-11c37b390c35"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.542080 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-config-data" (OuterVolumeSpecName: "config-data") pod "378696c4-d1ad-4165-b792-11c37b390c35" (UID: "378696c4-d1ad-4165-b792-11c37b390c35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.572846 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.572880 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/378696c4-d1ad-4165-b792-11c37b390c35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.888507 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"378696c4-d1ad-4165-b792-11c37b390c35","Type":"ContainerDied","Data":"a60cc6ff3e24087b2a1b9c29cf93371ee822dec5099411d9be0b16ed8caaba47"} Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.888562 4898 scope.go:117] "RemoveContainer" containerID="5b6e52ddedc83d69d942d29c81a79a8a88df5f31c6828c3fd8e7114f88215fe6" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.888771 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.929538 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.941719 4898 scope.go:117] "RemoveContainer" containerID="1060df277c29d530022ccd8aa959c7f49b5d54acbec4b4b623abacb026babed4" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.957415 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.973549 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:30:42 crc kubenswrapper[4898]: E1211 13:30:42.973950 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="378696c4-d1ad-4165-b792-11c37b390c35" containerName="sg-core" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.973963 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="378696c4-d1ad-4165-b792-11c37b390c35" containerName="sg-core" Dec 11 13:30:42 crc kubenswrapper[4898]: E1211 13:30:42.973976 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="378696c4-d1ad-4165-b792-11c37b390c35" containerName="ceilometer-notification-agent" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.973982 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="378696c4-d1ad-4165-b792-11c37b390c35" containerName="ceilometer-notification-agent" Dec 11 13:30:42 crc kubenswrapper[4898]: E1211 13:30:42.973990 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="378696c4-d1ad-4165-b792-11c37b390c35" containerName="ceilometer-central-agent" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.973995 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="378696c4-d1ad-4165-b792-11c37b390c35" containerName="ceilometer-central-agent" Dec 11 13:30:42 crc kubenswrapper[4898]: E1211 13:30:42.974008 4898 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="378696c4-d1ad-4165-b792-11c37b390c35" containerName="proxy-httpd" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.974015 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="378696c4-d1ad-4165-b792-11c37b390c35" containerName="proxy-httpd" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.974243 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="378696c4-d1ad-4165-b792-11c37b390c35" containerName="proxy-httpd" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.974258 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="378696c4-d1ad-4165-b792-11c37b390c35" containerName="ceilometer-central-agent" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.974273 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="378696c4-d1ad-4165-b792-11c37b390c35" containerName="ceilometer-notification-agent" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.974281 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="378696c4-d1ad-4165-b792-11c37b390c35" containerName="sg-core" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.976167 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:30:42 crc kubenswrapper[4898]: I1211 13:30:42.977510 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.003995 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.004392 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.009552 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.064794 4898 scope.go:117] "RemoveContainer" containerID="08b26dd5ecdcb4fa98fd969c8a86c3fbf902869d2c52ae197f13633d4a93fb74" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.097102 4898 scope.go:117] "RemoveContainer" containerID="4d9ee735b7d576f6f707efd923ccb25d7fa330c6bacb8416fee94406c0ce8d48" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.100349 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-scripts\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.100491 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.100664 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-log-httpd\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.100787 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.100912 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-config-data\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.100956 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swr88\" (UniqueName: \"kubernetes.io/projected/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-kube-api-access-swr88\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.101023 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.101157 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-run-httpd\") pod \"ceilometer-0\" (UID: 
\"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.203204 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-run-httpd\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.203316 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-scripts\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.203386 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.203418 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-log-httpd\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.203449 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.203502 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-config-data\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.203522 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swr88\" (UniqueName: \"kubernetes.io/projected/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-kube-api-access-swr88\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.203554 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.203784 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-run-httpd\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.204072 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-log-httpd\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.212096 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: 
I1211 13:30:43.212365 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-config-data\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.212417 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.218041 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-scripts\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.223910 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swr88\" (UniqueName: \"kubernetes.io/projected/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-kube-api-access-swr88\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.243432 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71ee22f-68e7-43d7-8a6a-012ff8b8104e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d71ee22f-68e7-43d7-8a6a-012ff8b8104e\") " pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.323988 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.859895 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 13:30:43 crc kubenswrapper[4898]: W1211 13:30:43.884046 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd71ee22f_68e7_43d7_8a6a_012ff8b8104e.slice/crio-56135eb2427e81b5b7634c4849fe3255ebf461bbd459b19cafa1ec721cd7eff4 WatchSource:0}: Error finding container 56135eb2427e81b5b7634c4849fe3255ebf461bbd459b19cafa1ec721cd7eff4: Status 404 returned error can't find the container with id 56135eb2427e81b5b7634c4849fe3255ebf461bbd459b19cafa1ec721cd7eff4 Dec 11 13:30:43 crc kubenswrapper[4898]: I1211 13:30:43.957180 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d71ee22f-68e7-43d7-8a6a-012ff8b8104e","Type":"ContainerStarted","Data":"56135eb2427e81b5b7634c4849fe3255ebf461bbd459b19cafa1ec721cd7eff4"} Dec 11 13:30:44 crc kubenswrapper[4898]: I1211 13:30:44.790847 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="378696c4-d1ad-4165-b792-11c37b390c35" path="/var/lib/kubelet/pods/378696c4-d1ad-4165-b792-11c37b390c35/volumes" Dec 11 13:30:45 crc kubenswrapper[4898]: I1211 13:30:45.626771 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="026f0391-aa61-4b41-963f-239e08b0cd34" containerName="rabbitmq" containerID="cri-o://47a1737676630cf454851548de058721d30c553e93dce666a023ff5347745677" gracePeriod=604796 Dec 11 13:30:48 crc kubenswrapper[4898]: I1211 13:30:48.187079 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6eaf839e-626a-4f9a-b489-d9c37cee9065" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.130:5671: connect: connection refused" Dec 11 13:30:48 crc kubenswrapper[4898]: 
I1211 13:30:48.489532 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="026f0391-aa61-4b41-963f-239e08b0cd34" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.131:5671: connect: connection refused" Dec 11 13:30:49 crc kubenswrapper[4898]: E1211 13:30:49.505175 4898 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 11 13:30:49 crc kubenswrapper[4898]: command '/bin/bash -c if [ ! -z "$(cat /etc/pod-info/skipPreStopChecks)" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 && rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true && rabbitmq-upgrade drain -t 604800' exited with 69: Dec 11 13:30:49 crc kubenswrapper[4898]: 13:30:44.932 [warning] This node is being put into maintenance (drain) mode Dec 11 13:30:49 crc kubenswrapper[4898]: Dec 11 13:30:49 crc kubenswrapper[4898]: 13:30:44.934 [warning] Suspended all listeners and will no longer accept client connections Dec 11 13:30:49 crc kubenswrapper[4898]: Error: Dec 11 13:30:49 crc kubenswrapper[4898]: {:channel_termination_timeout, {:gen_server, :call, [#PID<11968.12466.0>, {:shutdown, 'Node was put into maintenance mode'}, :infinity]}} Dec 11 13:30:49 crc kubenswrapper[4898]: > execCommand=["/bin/bash","-c","if [ ! -z \"$(cat /etc/pod-info/skipPreStopChecks)\" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 \u0026\u0026 rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true \u0026\u0026 rabbitmq-upgrade drain -t 604800"] containerName="rabbitmq" pod="openstack/rabbitmq-server-0" message=< Dec 11 13:30:49 crc kubenswrapper[4898]: Will wait for a quorum + 1 of nodes to be online for all quorum queues for 604800 seconds... 
Dec 11 13:30:49 crc kubenswrapper[4898]: Target node seems to be the only one in a single node cluster, the command does not apply Dec 11 13:30:49 crc kubenswrapper[4898]: Will wait for a synchronised mirror be online for all classic mirrored queues for 604800 seconds... Dec 11 13:30:49 crc kubenswrapper[4898]: Target node seems to be the only one in a single node cluster, the command does not apply Dec 11 13:30:49 crc kubenswrapper[4898]: Will put node rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack into maintenance mode. The node will no longer serve any client traffic! Dec 11 13:30:49 crc kubenswrapper[4898]: Dec 11 13:30:49 crc kubenswrapper[4898]: 13:30:44.932 [warning] This node is being put into maintenance (drain) mode Dec 11 13:30:49 crc kubenswrapper[4898]: Dec 11 13:30:49 crc kubenswrapper[4898]: 13:30:44.934 [warning] Suspended all listeners and will no longer accept client connections Dec 11 13:30:49 crc kubenswrapper[4898]: Error: Dec 11 13:30:49 crc kubenswrapper[4898]: {:channel_termination_timeout, {:gen_server, :call, [#PID<11968.12466.0>, {:shutdown, 'Node was put into maintenance mode'}, :infinity]}} Dec 11 13:30:49 crc kubenswrapper[4898]: > Dec 11 13:30:49 crc kubenswrapper[4898]: E1211 13:30:49.505474 4898 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 11 13:30:49 crc kubenswrapper[4898]: command '/bin/bash -c if [ ! 
-z "$(cat /etc/pod-info/skipPreStopChecks)" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 && rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true && rabbitmq-upgrade drain -t 604800' exited with 69: Dec 11 13:30:49 crc kubenswrapper[4898]: 13:30:44.932 [warning] This node is being put into maintenance (drain) mode Dec 11 13:30:49 crc kubenswrapper[4898]: Dec 11 13:30:49 crc kubenswrapper[4898]: 13:30:44.934 [warning] Suspended all listeners and will no longer accept client connections Dec 11 13:30:49 crc kubenswrapper[4898]: Error: Dec 11 13:30:49 crc kubenswrapper[4898]: {:channel_termination_timeout, {:gen_server, :call, [#PID<11968.12466.0>, {:shutdown, 'Node was put into maintenance mode'}, :infinity]}} Dec 11 13:30:49 crc kubenswrapper[4898]: > pod="openstack/rabbitmq-server-0" podUID="6eaf839e-626a-4f9a-b489-d9c37cee9065" containerName="rabbitmq" containerID="cri-o://a46216fb9d726589cfc7c90a401d302f459367871eee5fff476cedaf7f8498bd" Dec 11 13:30:49 crc kubenswrapper[4898]: I1211 13:30:49.505511 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="6eaf839e-626a-4f9a-b489-d9c37cee9065" containerName="rabbitmq" containerID="cri-o://a46216fb9d726589cfc7c90a401d302f459367871eee5fff476cedaf7f8498bd" gracePeriod=604791 Dec 11 13:30:53 crc kubenswrapper[4898]: I1211 13:30:53.074702 4898 generic.go:334] "Generic (PLEG): container finished" podID="026f0391-aa61-4b41-963f-239e08b0cd34" containerID="47a1737676630cf454851548de058721d30c553e93dce666a023ff5347745677" exitCode=0 Dec 11 13:30:53 crc kubenswrapper[4898]: I1211 13:30:53.074779 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"026f0391-aa61-4b41-963f-239e08b0cd34","Type":"ContainerDied","Data":"47a1737676630cf454851548de058721d30c553e93dce666a023ff5347745677"} Dec 11 13:30:58 crc kubenswrapper[4898]: I1211 13:30:58.187738 4898 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6eaf839e-626a-4f9a-b489-d9c37cee9065" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.130:5671: connect: connection refused" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.194604 4898 generic.go:334] "Generic (PLEG): container finished" podID="6eaf839e-626a-4f9a-b489-d9c37cee9065" containerID="a46216fb9d726589cfc7c90a401d302f459367871eee5fff476cedaf7f8498bd" exitCode=0 Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.194726 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6eaf839e-626a-4f9a-b489-d9c37cee9065","Type":"ContainerDied","Data":"a46216fb9d726589cfc7c90a401d302f459367871eee5fff476cedaf7f8498bd"} Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.655442 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-zhl76"] Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.658009 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.661370 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.674637 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-zhl76"] Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.727731 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-config\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.727808 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.727849 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.727904 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " 
pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.727989 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzddq\" (UniqueName: \"kubernetes.io/projected/b39b71ba-a893-4a17-92da-bf3db2cf671a-kube-api-access-wzddq\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.728019 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.728040 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.830155 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-config\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.831920 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " 
pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.832002 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.832187 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.831565 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-config\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.833392 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzddq\" (UniqueName: \"kubernetes.io/projected/b39b71ba-a893-4a17-92da-bf3db2cf671a-kube-api-access-wzddq\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.833433 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc 
kubenswrapper[4898]: I1211 13:31:02.833592 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.833631 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.834331 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.835724 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.836605 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.837841 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: 
\"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.845066 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:02 crc kubenswrapper[4898]: I1211 13:31:02.861340 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzddq\" (UniqueName: \"kubernetes.io/projected/b39b71ba-a893-4a17-92da-bf3db2cf671a-kube-api-access-wzddq\") pod \"dnsmasq-dns-5b75489c6f-zhl76\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.027089 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.488221 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="026f0391-aa61-4b41-963f-239e08b0cd34" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.131:5671: i/o timeout" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.523709 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.550974 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-config-data\") pod \"026f0391-aa61-4b41-963f-239e08b0cd34\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.551073 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-confd\") pod \"026f0391-aa61-4b41-963f-239e08b0cd34\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.551107 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-plugins\") pod \"026f0391-aa61-4b41-963f-239e08b0cd34\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.551141 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"026f0391-aa61-4b41-963f-239e08b0cd34\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.551170 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-server-conf\") pod \"026f0391-aa61-4b41-963f-239e08b0cd34\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.551341 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/026f0391-aa61-4b41-963f-239e08b0cd34-erlang-cookie-secret\") pod \"026f0391-aa61-4b41-963f-239e08b0cd34\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.551384 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-erlang-cookie\") pod \"026f0391-aa61-4b41-963f-239e08b0cd34\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.551500 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-plugins-conf\") pod \"026f0391-aa61-4b41-963f-239e08b0cd34\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.551529 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-tls\") pod \"026f0391-aa61-4b41-963f-239e08b0cd34\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.551559 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/026f0391-aa61-4b41-963f-239e08b0cd34-pod-info\") pod \"026f0391-aa61-4b41-963f-239e08b0cd34\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.551617 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68zxq\" (UniqueName: \"kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-kube-api-access-68zxq\") pod \"026f0391-aa61-4b41-963f-239e08b0cd34\" (UID: \"026f0391-aa61-4b41-963f-239e08b0cd34\") " Dec 11 13:31:03 crc 
kubenswrapper[4898]: I1211 13:31:03.553211 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "026f0391-aa61-4b41-963f-239e08b0cd34" (UID: "026f0391-aa61-4b41-963f-239e08b0cd34"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.553558 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "026f0391-aa61-4b41-963f-239e08b0cd34" (UID: "026f0391-aa61-4b41-963f-239e08b0cd34"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.557412 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "026f0391-aa61-4b41-963f-239e08b0cd34" (UID: "026f0391-aa61-4b41-963f-239e08b0cd34"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.568017 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/026f0391-aa61-4b41-963f-239e08b0cd34-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "026f0391-aa61-4b41-963f-239e08b0cd34" (UID: "026f0391-aa61-4b41-963f-239e08b0cd34"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.576523 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "026f0391-aa61-4b41-963f-239e08b0cd34" (UID: "026f0391-aa61-4b41-963f-239e08b0cd34"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.576769 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-kube-api-access-68zxq" (OuterVolumeSpecName: "kube-api-access-68zxq") pod "026f0391-aa61-4b41-963f-239e08b0cd34" (UID: "026f0391-aa61-4b41-963f-239e08b0cd34"). InnerVolumeSpecName "kube-api-access-68zxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.577712 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/026f0391-aa61-4b41-963f-239e08b0cd34-pod-info" (OuterVolumeSpecName: "pod-info") pod "026f0391-aa61-4b41-963f-239e08b0cd34" (UID: "026f0391-aa61-4b41-963f-239e08b0cd34"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.592749 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "026f0391-aa61-4b41-963f-239e08b0cd34" (UID: "026f0391-aa61-4b41-963f-239e08b0cd34"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.634829 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-config-data" (OuterVolumeSpecName: "config-data") pod "026f0391-aa61-4b41-963f-239e08b0cd34" (UID: "026f0391-aa61-4b41-963f-239e08b0cd34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.663368 4898 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.663401 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.663414 4898 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/026f0391-aa61-4b41-963f-239e08b0cd34-pod-info\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.663426 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68zxq\" (UniqueName: \"kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-kube-api-access-68zxq\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.663442 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.663492 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.663525 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.663540 4898 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/026f0391-aa61-4b41-963f-239e08b0cd34-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.663553 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.732719 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-server-conf" (OuterVolumeSpecName: "server-conf") pod "026f0391-aa61-4b41-963f-239e08b0cd34" (UID: "026f0391-aa61-4b41-963f-239e08b0cd34"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.753709 4898 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.766568 4898 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.766607 4898 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/026f0391-aa61-4b41-963f-239e08b0cd34-server-conf\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.832985 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "026f0391-aa61-4b41-963f-239e08b0cd34" (UID: "026f0391-aa61-4b41-963f-239e08b0cd34"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:31:03 crc kubenswrapper[4898]: I1211 13:31:03.868553 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/026f0391-aa61-4b41-963f-239e08b0cd34-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.229217 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"026f0391-aa61-4b41-963f-239e08b0cd34","Type":"ContainerDied","Data":"8d647938454b298fe7a9ca7949a3907dfea0a58d81fcccb2a95acc292402e589"} Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.229278 4898 scope.go:117] "RemoveContainer" containerID="47a1737676630cf454851548de058721d30c553e93dce666a023ff5347745677" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.229739 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.274679 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.306183 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.330003 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 13:31:04 crc kubenswrapper[4898]: E1211 13:31:04.330854 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="026f0391-aa61-4b41-963f-239e08b0cd34" containerName="setup-container" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.330951 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="026f0391-aa61-4b41-963f-239e08b0cd34" containerName="setup-container" Dec 11 13:31:04 crc kubenswrapper[4898]: E1211 13:31:04.331032 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="026f0391-aa61-4b41-963f-239e08b0cd34" containerName="rabbitmq" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.331092 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="026f0391-aa61-4b41-963f-239e08b0cd34" containerName="rabbitmq" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.331400 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="026f0391-aa61-4b41-963f-239e08b0cd34" containerName="rabbitmq" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.332979 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.336985 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.337215 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.339870 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.340790 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.340993 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6hkm6" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.341306 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.341363 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.344048 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.383932 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91c646bc-40ca-434e-8db2-df2eb46c4e5e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.384221 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/91c646bc-40ca-434e-8db2-df2eb46c4e5e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.384280 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/91c646bc-40ca-434e-8db2-df2eb46c4e5e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.384367 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/91c646bc-40ca-434e-8db2-df2eb46c4e5e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.384392 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zq92\" (UniqueName: \"kubernetes.io/projected/91c646bc-40ca-434e-8db2-df2eb46c4e5e-kube-api-access-6zq92\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.384521 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/91c646bc-40ca-434e-8db2-df2eb46c4e5e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.384592 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/91c646bc-40ca-434e-8db2-df2eb46c4e5e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.384710 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/91c646bc-40ca-434e-8db2-df2eb46c4e5e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.384831 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/91c646bc-40ca-434e-8db2-df2eb46c4e5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.385314 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/91c646bc-40ca-434e-8db2-df2eb46c4e5e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") 
" pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.385399 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: E1211 13:31:04.450808 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Dec 11 13:31:04 crc kubenswrapper[4898]: E1211 13:31:04.450861 4898 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Dec 11 13:31:04 crc kubenswrapper[4898]: E1211 13:31:04.450997 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n7bbk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-j2jpr_openstack(77bd783a-a06c-400b-b29e-a6edb7d613b3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 
11 13:31:04 crc kubenswrapper[4898]: E1211 13:31:04.452603 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-j2jpr" podUID="77bd783a-a06c-400b-b29e-a6edb7d613b3" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.488265 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/91c646bc-40ca-434e-8db2-df2eb46c4e5e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.488347 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.488451 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91c646bc-40ca-434e-8db2-df2eb46c4e5e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.488511 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/91c646bc-40ca-434e-8db2-df2eb46c4e5e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.488549 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/91c646bc-40ca-434e-8db2-df2eb46c4e5e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.488610 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/91c646bc-40ca-434e-8db2-df2eb46c4e5e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.488637 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zq92\" (UniqueName: \"kubernetes.io/projected/91c646bc-40ca-434e-8db2-df2eb46c4e5e-kube-api-access-6zq92\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.488700 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/91c646bc-40ca-434e-8db2-df2eb46c4e5e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.488752 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/91c646bc-40ca-434e-8db2-df2eb46c4e5e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.488823 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/91c646bc-40ca-434e-8db2-df2eb46c4e5e-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.488905 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/91c646bc-40ca-434e-8db2-df2eb46c4e5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.488925 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.489541 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/91c646bc-40ca-434e-8db2-df2eb46c4e5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.490689 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/91c646bc-40ca-434e-8db2-df2eb46c4e5e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.491278 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/91c646bc-40ca-434e-8db2-df2eb46c4e5e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.492173 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/91c646bc-40ca-434e-8db2-df2eb46c4e5e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.495168 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/91c646bc-40ca-434e-8db2-df2eb46c4e5e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.495622 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/91c646bc-40ca-434e-8db2-df2eb46c4e5e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.496393 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/91c646bc-40ca-434e-8db2-df2eb46c4e5e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.501265 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/91c646bc-40ca-434e-8db2-df2eb46c4e5e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.507959 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6zq92\" (UniqueName: \"kubernetes.io/projected/91c646bc-40ca-434e-8db2-df2eb46c4e5e-kube-api-access-6zq92\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.511621 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/91c646bc-40ca-434e-8db2-df2eb46c4e5e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.535733 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"91c646bc-40ca-434e-8db2-df2eb46c4e5e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.664627 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.790596 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="026f0391-aa61-4b41-963f-239e08b0cd34" path="/var/lib/kubelet/pods/026f0391-aa61-4b41-963f-239e08b0cd34/volumes" Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.995696 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:31:04 crc kubenswrapper[4898]: I1211 13:31:04.996058 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.016623 4898 scope.go:117] "RemoveContainer" containerID="9bd52eab2cebdafa7238c3a7545f94428c7ec74df13750e07166556883515e5a" Dec 11 13:31:05 crc kubenswrapper[4898]: E1211 13:31:05.016689 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Dec 11 13:31:05 crc kubenswrapper[4898]: E1211 13:31:05.016751 4898 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Dec 11 13:31:05 crc kubenswrapper[4898]: E1211 13:31:05.016926 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n54fhf5h555h574h559h8fh5c8h659h5ddh5bbhcdh64dh655h569h675h54h599h67fh66ch654h5bch649h64fh695h64h87h695hcbh76hd8h5b7h559q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-swr88,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(d71ee22f-68e7-43d7-8a6a-012ff8b8104e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.217859 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.274161 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6eaf839e-626a-4f9a-b489-d9c37cee9065","Type":"ContainerDied","Data":"c40c2a955382f5d301c518d9e470a4554c1814de3c80decbb2eb1cad292fcaf0"} Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.274250 4898 scope.go:117] "RemoveContainer" containerID="a46216fb9d726589cfc7c90a401d302f459367871eee5fff476cedaf7f8498bd" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.274792 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.331424 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-config-data\") pod \"6eaf839e-626a-4f9a-b489-d9c37cee9065\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.332744 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6eaf839e-626a-4f9a-b489-d9c37cee9065-erlang-cookie-secret\") pod \"6eaf839e-626a-4f9a-b489-d9c37cee9065\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.332885 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"6eaf839e-626a-4f9a-b489-d9c37cee9065\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.333002 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc2fn\" (UniqueName: \"kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-kube-api-access-bc2fn\") pod \"6eaf839e-626a-4f9a-b489-d9c37cee9065\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.333112 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-plugins\") pod \"6eaf839e-626a-4f9a-b489-d9c37cee9065\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.333216 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-confd\") pod \"6eaf839e-626a-4f9a-b489-d9c37cee9065\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.333307 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-tls\") pod \"6eaf839e-626a-4f9a-b489-d9c37cee9065\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.333444 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6eaf839e-626a-4f9a-b489-d9c37cee9065-pod-info\") pod \"6eaf839e-626a-4f9a-b489-d9c37cee9065\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.333669 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-plugins-conf\") pod \"6eaf839e-626a-4f9a-b489-d9c37cee9065\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.350697 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-server-conf\") pod \"6eaf839e-626a-4f9a-b489-d9c37cee9065\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.350794 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-erlang-cookie\") pod \"6eaf839e-626a-4f9a-b489-d9c37cee9065\" (UID: \"6eaf839e-626a-4f9a-b489-d9c37cee9065\") " Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 
13:31:05.356389 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6eaf839e-626a-4f9a-b489-d9c37cee9065" (UID: "6eaf839e-626a-4f9a-b489-d9c37cee9065"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.356636 4898 scope.go:117] "RemoveContainer" containerID="bbc99eb3803c73f2cd3f89521dd3bdaae4c7d877f90cf10b7734bbb008573b50" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.363812 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "6eaf839e-626a-4f9a-b489-d9c37cee9065" (UID: "6eaf839e-626a-4f9a-b489-d9c37cee9065"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.365739 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-kube-api-access-bc2fn" (OuterVolumeSpecName: "kube-api-access-bc2fn") pod "6eaf839e-626a-4f9a-b489-d9c37cee9065" (UID: "6eaf839e-626a-4f9a-b489-d9c37cee9065"). InnerVolumeSpecName "kube-api-access-bc2fn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:31:05 crc kubenswrapper[4898]: E1211 13:31:05.366203 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-j2jpr" podUID="77bd783a-a06c-400b-b29e-a6edb7d613b3" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.367801 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6eaf839e-626a-4f9a-b489-d9c37cee9065" (UID: "6eaf839e-626a-4f9a-b489-d9c37cee9065"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.369086 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eaf839e-626a-4f9a-b489-d9c37cee9065-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6eaf839e-626a-4f9a-b489-d9c37cee9065" (UID: "6eaf839e-626a-4f9a-b489-d9c37cee9065"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.369892 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6eaf839e-626a-4f9a-b489-d9c37cee9065" (UID: "6eaf839e-626a-4f9a-b489-d9c37cee9065"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.370048 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6eaf839e-626a-4f9a-b489-d9c37cee9065" (UID: "6eaf839e-626a-4f9a-b489-d9c37cee9065"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.371751 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6eaf839e-626a-4f9a-b489-d9c37cee9065-pod-info" (OuterVolumeSpecName: "pod-info") pod "6eaf839e-626a-4f9a-b489-d9c37cee9065" (UID: "6eaf839e-626a-4f9a-b489-d9c37cee9065"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.420788 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-config-data" (OuterVolumeSpecName: "config-data") pod "6eaf839e-626a-4f9a-b489-d9c37cee9065" (UID: "6eaf839e-626a-4f9a-b489-d9c37cee9065"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.454394 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc2fn\" (UniqueName: \"kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-kube-api-access-bc2fn\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.454431 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.454445 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.454471 4898 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6eaf839e-626a-4f9a-b489-d9c37cee9065-pod-info\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.454481 4898 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.454491 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.454499 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 
13:31:05.454507 4898 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6eaf839e-626a-4f9a-b489-d9c37cee9065-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.454533 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.467298 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-server-conf" (OuterVolumeSpecName: "server-conf") pod "6eaf839e-626a-4f9a-b489-d9c37cee9065" (UID: "6eaf839e-626a-4f9a-b489-d9c37cee9065"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.479696 4898 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.557202 4898 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6eaf839e-626a-4f9a-b489-d9c37cee9065-server-conf\") on node \"crc\" DevicePath \"\""
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.557241 4898 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.578606 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6eaf839e-626a-4f9a-b489-d9c37cee9065" (UID: "6eaf839e-626a-4f9a-b489-d9c37cee9065"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.661949 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6eaf839e-626a-4f9a-b489-d9c37cee9065-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.666489 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-zhl76"]
Dec 11 13:31:05 crc kubenswrapper[4898]: W1211 13:31:05.677705 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb39b71ba_a893_4a17_92da_bf3db2cf671a.slice/crio-2c6dec5b611c7677470bdd60300521e9ff38fd47785aed2ed144668aa94754ba WatchSource:0}: Error finding container 2c6dec5b611c7677470bdd60300521e9ff38fd47785aed2ed144668aa94754ba: Status 404 returned error can't find the container with id 2c6dec5b611c7677470bdd60300521e9ff38fd47785aed2ed144668aa94754ba
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.699962 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.917953 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.933997 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.948042 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 11 13:31:05 crc kubenswrapper[4898]: E1211 13:31:05.948595 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eaf839e-626a-4f9a-b489-d9c37cee9065" containerName="setup-container"
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.948615 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eaf839e-626a-4f9a-b489-d9c37cee9065" containerName="setup-container"
Dec 11 13:31:05 crc kubenswrapper[4898]: E1211 13:31:05.948647 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eaf839e-626a-4f9a-b489-d9c37cee9065" containerName="rabbitmq"
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.948654 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eaf839e-626a-4f9a-b489-d9c37cee9065" containerName="rabbitmq"
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.948948 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eaf839e-626a-4f9a-b489-d9c37cee9065" containerName="rabbitmq"
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.950642 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.956127 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.956141 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.956127 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.956341 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-z7sht"
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.956407 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.956495 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.956715 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Dec 11 13:31:05 crc kubenswrapper[4898]: I1211 13:31:05.964817 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.070227 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d7d19abc-90d0-413d-b8d7-67ae58b010f7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.070293 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d7d19abc-90d0-413d-b8d7-67ae58b010f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.070316 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d7d19abc-90d0-413d-b8d7-67ae58b010f7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.070363 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d7d19abc-90d0-413d-b8d7-67ae58b010f7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.070443 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d7d19abc-90d0-413d-b8d7-67ae58b010f7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.070513 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d7d19abc-90d0-413d-b8d7-67ae58b010f7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.070625 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szh7k\" (UniqueName: \"kubernetes.io/projected/d7d19abc-90d0-413d-b8d7-67ae58b010f7-kube-api-access-szh7k\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.070647 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7d19abc-90d0-413d-b8d7-67ae58b010f7-config-data\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.070700 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.070766 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d7d19abc-90d0-413d-b8d7-67ae58b010f7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.071045 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d7d19abc-90d0-413d-b8d7-67ae58b010f7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.172626 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d7d19abc-90d0-413d-b8d7-67ae58b010f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.172897 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d7d19abc-90d0-413d-b8d7-67ae58b010f7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.172921 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d7d19abc-90d0-413d-b8d7-67ae58b010f7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.172984 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d7d19abc-90d0-413d-b8d7-67ae58b010f7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.173025 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d7d19abc-90d0-413d-b8d7-67ae58b010f7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.173070 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szh7k\" (UniqueName: \"kubernetes.io/projected/d7d19abc-90d0-413d-b8d7-67ae58b010f7-kube-api-access-szh7k\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.173087 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7d19abc-90d0-413d-b8d7-67ae58b010f7-config-data\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.173136 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.173197 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d7d19abc-90d0-413d-b8d7-67ae58b010f7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.173228 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d7d19abc-90d0-413d-b8d7-67ae58b010f7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.173252 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d7d19abc-90d0-413d-b8d7-67ae58b010f7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.173291 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d7d19abc-90d0-413d-b8d7-67ae58b010f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.173933 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d7d19abc-90d0-413d-b8d7-67ae58b010f7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.174136 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.174415 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7d19abc-90d0-413d-b8d7-67ae58b010f7-config-data\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.174186 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d7d19abc-90d0-413d-b8d7-67ae58b010f7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.174721 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d7d19abc-90d0-413d-b8d7-67ae58b010f7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.177943 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d7d19abc-90d0-413d-b8d7-67ae58b010f7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.178112 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d7d19abc-90d0-413d-b8d7-67ae58b010f7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.178848 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d7d19abc-90d0-413d-b8d7-67ae58b010f7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.179547 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d7d19abc-90d0-413d-b8d7-67ae58b010f7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.201838 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szh7k\" (UniqueName: \"kubernetes.io/projected/d7d19abc-90d0-413d-b8d7-67ae58b010f7-kube-api-access-szh7k\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.221855 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"d7d19abc-90d0-413d-b8d7-67ae58b010f7\") " pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.294073 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"91c646bc-40ca-434e-8db2-df2eb46c4e5e","Type":"ContainerStarted","Data":"207a642d64cfb337d159822e364dc370ad41f3f90b38a665b8900494ec103d59"}
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.297371 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" event={"ID":"b39b71ba-a893-4a17-92da-bf3db2cf671a","Type":"ContainerStarted","Data":"2c6dec5b611c7677470bdd60300521e9ff38fd47785aed2ed144668aa94754ba"}
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.300666 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d71ee22f-68e7-43d7-8a6a-012ff8b8104e","Type":"ContainerStarted","Data":"6ec900b32a52dcaf7647261cc0daf1de5a92a05c1bbe2578bdcd8671478210dc"}
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.374921 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.790130 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eaf839e-626a-4f9a-b489-d9c37cee9065" path="/var/lib/kubelet/pods/6eaf839e-626a-4f9a-b489-d9c37cee9065/volumes"
Dec 11 13:31:06 crc kubenswrapper[4898]: W1211 13:31:06.897662 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7d19abc_90d0_413d_b8d7_67ae58b010f7.slice/crio-7e74c936abc3b85ac7d73d4f8aba5d1d2707836199b2874a27a03d8c0d977b79 WatchSource:0}: Error finding container 7e74c936abc3b85ac7d73d4f8aba5d1d2707836199b2874a27a03d8c0d977b79: Status 404 returned error can't find the container with id 7e74c936abc3b85ac7d73d4f8aba5d1d2707836199b2874a27a03d8c0d977b79
Dec 11 13:31:06 crc kubenswrapper[4898]: I1211 13:31:06.899225 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 11 13:31:07 crc kubenswrapper[4898]: I1211 13:31:07.319046 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d7d19abc-90d0-413d-b8d7-67ae58b010f7","Type":"ContainerStarted","Data":"7e74c936abc3b85ac7d73d4f8aba5d1d2707836199b2874a27a03d8c0d977b79"}
Dec 11 13:31:07 crc kubenswrapper[4898]: I1211 13:31:07.323318 4898 generic.go:334] "Generic (PLEG): container finished" podID="b39b71ba-a893-4a17-92da-bf3db2cf671a" containerID="0696c18c4423a2b48b5deac7fdd2156f4c06f7570111249dc08cb99e6aae8183" exitCode=0
Dec 11 13:31:07 crc kubenswrapper[4898]: I1211 13:31:07.323545 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" event={"ID":"b39b71ba-a893-4a17-92da-bf3db2cf671a","Type":"ContainerDied","Data":"0696c18c4423a2b48b5deac7fdd2156f4c06f7570111249dc08cb99e6aae8183"}
Dec 11 13:31:07 crc kubenswrapper[4898]: I1211 13:31:07.329371 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d71ee22f-68e7-43d7-8a6a-012ff8b8104e","Type":"ContainerStarted","Data":"abe85d310e55dfdac30291c84a9ea926c9f5f7e6ef5a3fd3f7d0089e296c8584"}
Dec 11 13:31:08 crc kubenswrapper[4898]: E1211 13:31:08.193135 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="d71ee22f-68e7-43d7-8a6a-012ff8b8104e"
Dec 11 13:31:08 crc kubenswrapper[4898]: I1211 13:31:08.343021 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d71ee22f-68e7-43d7-8a6a-012ff8b8104e","Type":"ContainerStarted","Data":"57d07b180a7afcf139a3a83ed34e71fba58b47d006f8f250f6b5428112d4c9bf"}
Dec 11 13:31:08 crc kubenswrapper[4898]: I1211 13:31:08.343347 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 11 13:31:08 crc kubenswrapper[4898]: I1211 13:31:08.344807 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"91c646bc-40ca-434e-8db2-df2eb46c4e5e","Type":"ContainerStarted","Data":"4f5efd791407aba9140531ae3f50bf656aa889c9f67ee48de88c9393b231848c"}
Dec 11 13:31:08 crc kubenswrapper[4898]: E1211 13:31:08.345190 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="d71ee22f-68e7-43d7-8a6a-012ff8b8104e"
Dec 11 13:31:08 crc kubenswrapper[4898]: I1211 13:31:08.346847 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" event={"ID":"b39b71ba-a893-4a17-92da-bf3db2cf671a","Type":"ContainerStarted","Data":"7308587a486c8094c4299e6bde339a714ddc26375cd0bd878c28450438ed879d"}
Dec 11 13:31:08 crc kubenswrapper[4898]: I1211 13:31:08.347231 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b75489c6f-zhl76"
Dec 11 13:31:08 crc kubenswrapper[4898]: I1211 13:31:08.414538 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" podStartSLOduration=6.414518638 podStartE2EDuration="6.414518638s" podCreationTimestamp="2025-12-11 13:31:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:31:08.412091323 +0000 UTC m=+1625.984417760" watchObservedRunningTime="2025-12-11 13:31:08.414518638 +0000 UTC m=+1625.986845075"
Dec 11 13:31:09 crc kubenswrapper[4898]: I1211 13:31:09.357346 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d7d19abc-90d0-413d-b8d7-67ae58b010f7","Type":"ContainerStarted","Data":"62ab4288f949bc1e3c7a921fe13871a2ec5e772ccee672c86dab942ef3b6c0ed"}
Dec 11 13:31:09 crc kubenswrapper[4898]: E1211 13:31:09.359285 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="d71ee22f-68e7-43d7-8a6a-012ff8b8104e"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.029681 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b75489c6f-zhl76"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.112058 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-k4bhj"]
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.112317 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" podUID="df10e08a-4d2c-4260-8930-0a64e2fe8b0d" containerName="dnsmasq-dns" containerID="cri-o://42b6bd9e6e276bb794412bcfaf33a934513de982344b0a975064af0a20c56491" gracePeriod=10
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.305230 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-9gqbj"]
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.307431 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.359527 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-9gqbj"]
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.458807 4898 generic.go:334] "Generic (PLEG): container finished" podID="df10e08a-4d2c-4260-8930-0a64e2fe8b0d" containerID="42b6bd9e6e276bb794412bcfaf33a934513de982344b0a975064af0a20c56491" exitCode=0
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.458860 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" event={"ID":"df10e08a-4d2c-4260-8930-0a64e2fe8b0d","Type":"ContainerDied","Data":"42b6bd9e6e276bb794412bcfaf33a934513de982344b0a975064af0a20c56491"}
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.489010 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjbck\" (UniqueName: \"kubernetes.io/projected/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-kube-api-access-pjbck\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.489172 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.489224 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-config\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.489417 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.489472 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.489537 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.489561 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.591378 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.591486 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-config\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.591692 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.591739 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.591806 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.591835 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.591923 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjbck\" (UniqueName: \"kubernetes.io/projected/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-kube-api-access-pjbck\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.593444 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.597112 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.597732 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.598253 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-config\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.598394 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.598425 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.628391 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjbck\" (UniqueName: \"kubernetes.io/projected/1324fad0-9e86-4e14-9d9f-da9de3cf3be7-kube-api-access-pjbck\") pod \"dnsmasq-dns-5d75f767dc-9gqbj\" (UID: \"1324fad0-9e86-4e14-9d9f-da9de3cf3be7\") " pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.647428 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj"
Dec 11 13:31:13 crc kubenswrapper[4898]: I1211 13:31:13.952554 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj"
Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.005142 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-dns-swift-storage-0\") pod \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") "
Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.005248 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-dns-svc\") pod \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") "
Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.005277 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-ovsdbserver-nb\") pod \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") "
Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.005312 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqnwd\" (UniqueName: \"kubernetes.io/projected/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-kube-api-access-nqnwd\") pod \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") "
Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.005366 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-config\") pod \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") "
Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.005385 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-ovsdbserver-sb\") pod \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\" (UID: \"df10e08a-4d2c-4260-8930-0a64e2fe8b0d\") "
Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.012769 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-kube-api-access-nqnwd" (OuterVolumeSpecName: "kube-api-access-nqnwd") pod "df10e08a-4d2c-4260-8930-0a64e2fe8b0d" (UID: "df10e08a-4d2c-4260-8930-0a64e2fe8b0d"). InnerVolumeSpecName "kube-api-access-nqnwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.074510 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df10e08a-4d2c-4260-8930-0a64e2fe8b0d" (UID: "df10e08a-4d2c-4260-8930-0a64e2fe8b0d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.083027 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df10e08a-4d2c-4260-8930-0a64e2fe8b0d" (UID: "df10e08a-4d2c-4260-8930-0a64e2fe8b0d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.089726 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df10e08a-4d2c-4260-8930-0a64e2fe8b0d" (UID: "df10e08a-4d2c-4260-8930-0a64e2fe8b0d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.107939 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "df10e08a-4d2c-4260-8930-0a64e2fe8b0d" (UID: "df10e08a-4d2c-4260-8930-0a64e2fe8b0d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.110717 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-config" (OuterVolumeSpecName: "config") pod "df10e08a-4d2c-4260-8930-0a64e2fe8b0d" (UID: "df10e08a-4d2c-4260-8930-0a64e2fe8b0d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.114648 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.114731 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.114743 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.114789 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqnwd\" (UniqueName: \"kubernetes.io/projected/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-kube-api-access-nqnwd\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.114805 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.114885 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df10e08a-4d2c-4260-8930-0a64e2fe8b0d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.289901 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-9gqbj"] Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.474965 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" 
event={"ID":"df10e08a-4d2c-4260-8930-0a64e2fe8b0d","Type":"ContainerDied","Data":"e6a6639c9b15ed8dbeb93c198b57242f136275fa48621bd33e7c9dd3da547616"} Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.475196 4898 scope.go:117] "RemoveContainer" containerID="42b6bd9e6e276bb794412bcfaf33a934513de982344b0a975064af0a20c56491" Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.475394 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-k4bhj" Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.477210 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj" event={"ID":"1324fad0-9e86-4e14-9d9f-da9de3cf3be7","Type":"ContainerStarted","Data":"8782a19c29dacd367967d5a6fd08a1820b0aee03efddbcab8e4ffdb8e6bd66e4"} Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.536229 4898 scope.go:117] "RemoveContainer" containerID="8bb3b02915c58562a243cd9617c5a57019af46823515e8b413162ea23fc0752b" Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.574130 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-k4bhj"] Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.585910 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-k4bhj"] Dec 11 13:31:14 crc kubenswrapper[4898]: I1211 13:31:14.792227 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df10e08a-4d2c-4260-8930-0a64e2fe8b0d" path="/var/lib/kubelet/pods/df10e08a-4d2c-4260-8930-0a64e2fe8b0d/volumes" Dec 11 13:31:15 crc kubenswrapper[4898]: I1211 13:31:15.502561 4898 generic.go:334] "Generic (PLEG): container finished" podID="1324fad0-9e86-4e14-9d9f-da9de3cf3be7" containerID="54cd16f7992c40302ebebc6e1c76679f47ad5cc0d852d5c248090b6bf68f76bd" exitCode=0 Dec 11 13:31:15 crc kubenswrapper[4898]: I1211 13:31:15.502685 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj" event={"ID":"1324fad0-9e86-4e14-9d9f-da9de3cf3be7","Type":"ContainerDied","Data":"54cd16f7992c40302ebebc6e1c76679f47ad5cc0d852d5c248090b6bf68f76bd"} Dec 11 13:31:16 crc kubenswrapper[4898]: I1211 13:31:16.518614 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj" event={"ID":"1324fad0-9e86-4e14-9d9f-da9de3cf3be7","Type":"ContainerStarted","Data":"58f5cd3b9833f798f74f6bc370eda0949756b5cb2f3a9a608961b71d7ee48995"} Dec 11 13:31:16 crc kubenswrapper[4898]: I1211 13:31:16.519627 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj" Dec 11 13:31:16 crc kubenswrapper[4898]: I1211 13:31:16.553061 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj" podStartSLOduration=3.553042173 podStartE2EDuration="3.553042173s" podCreationTimestamp="2025-12-11 13:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:31:16.542571442 +0000 UTC m=+1634.114897889" watchObservedRunningTime="2025-12-11 13:31:16.553042173 +0000 UTC m=+1634.125368610" Dec 11 13:31:18 crc kubenswrapper[4898]: I1211 13:31:18.549141 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-j2jpr" event={"ID":"77bd783a-a06c-400b-b29e-a6edb7d613b3","Type":"ContainerStarted","Data":"46916f77e2e5c90443e46eb8bb07913bae74b2a349750f84264c99971a3a5435"} Dec 11 13:31:18 crc kubenswrapper[4898]: I1211 13:31:18.571545 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-j2jpr" podStartSLOduration=1.985136555 podStartE2EDuration="41.571523628s" podCreationTimestamp="2025-12-11 13:30:37 +0000 UTC" firstStartedPulling="2025-12-11 13:30:38.425113413 +0000 UTC m=+1595.997439850" lastFinishedPulling="2025-12-11 13:31:18.011500486 +0000 UTC 
m=+1635.583826923" observedRunningTime="2025-12-11 13:31:18.56562135 +0000 UTC m=+1636.137947807" watchObservedRunningTime="2025-12-11 13:31:18.571523628 +0000 UTC m=+1636.143850065" Dec 11 13:31:20 crc kubenswrapper[4898]: I1211 13:31:20.586077 4898 generic.go:334] "Generic (PLEG): container finished" podID="77bd783a-a06c-400b-b29e-a6edb7d613b3" containerID="46916f77e2e5c90443e46eb8bb07913bae74b2a349750f84264c99971a3a5435" exitCode=0 Dec 11 13:31:20 crc kubenswrapper[4898]: I1211 13:31:20.586168 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-j2jpr" event={"ID":"77bd783a-a06c-400b-b29e-a6edb7d613b3","Type":"ContainerDied","Data":"46916f77e2e5c90443e46eb8bb07913bae74b2a349750f84264c99971a3a5435"} Dec 11 13:31:22 crc kubenswrapper[4898]: I1211 13:31:22.073916 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-j2jpr" Dec 11 13:31:22 crc kubenswrapper[4898]: I1211 13:31:22.138867 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bd783a-a06c-400b-b29e-a6edb7d613b3-combined-ca-bundle\") pod \"77bd783a-a06c-400b-b29e-a6edb7d613b3\" (UID: \"77bd783a-a06c-400b-b29e-a6edb7d613b3\") " Dec 11 13:31:22 crc kubenswrapper[4898]: I1211 13:31:22.139060 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77bd783a-a06c-400b-b29e-a6edb7d613b3-config-data\") pod \"77bd783a-a06c-400b-b29e-a6edb7d613b3\" (UID: \"77bd783a-a06c-400b-b29e-a6edb7d613b3\") " Dec 11 13:31:22 crc kubenswrapper[4898]: I1211 13:31:22.139132 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7bbk\" (UniqueName: \"kubernetes.io/projected/77bd783a-a06c-400b-b29e-a6edb7d613b3-kube-api-access-n7bbk\") pod \"77bd783a-a06c-400b-b29e-a6edb7d613b3\" (UID: \"77bd783a-a06c-400b-b29e-a6edb7d613b3\") " Dec 11 
13:31:22 crc kubenswrapper[4898]: I1211 13:31:22.147735 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bd783a-a06c-400b-b29e-a6edb7d613b3-kube-api-access-n7bbk" (OuterVolumeSpecName: "kube-api-access-n7bbk") pod "77bd783a-a06c-400b-b29e-a6edb7d613b3" (UID: "77bd783a-a06c-400b-b29e-a6edb7d613b3"). InnerVolumeSpecName "kube-api-access-n7bbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:31:22 crc kubenswrapper[4898]: I1211 13:31:22.180061 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bd783a-a06c-400b-b29e-a6edb7d613b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77bd783a-a06c-400b-b29e-a6edb7d613b3" (UID: "77bd783a-a06c-400b-b29e-a6edb7d613b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:31:22 crc kubenswrapper[4898]: I1211 13:31:22.241148 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bd783a-a06c-400b-b29e-a6edb7d613b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:22 crc kubenswrapper[4898]: I1211 13:31:22.241175 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7bbk\" (UniqueName: \"kubernetes.io/projected/77bd783a-a06c-400b-b29e-a6edb7d613b3-kube-api-access-n7bbk\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:22 crc kubenswrapper[4898]: I1211 13:31:22.246423 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bd783a-a06c-400b-b29e-a6edb7d613b3-config-data" (OuterVolumeSpecName: "config-data") pod "77bd783a-a06c-400b-b29e-a6edb7d613b3" (UID: "77bd783a-a06c-400b-b29e-a6edb7d613b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:31:22 crc kubenswrapper[4898]: I1211 13:31:22.344097 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77bd783a-a06c-400b-b29e-a6edb7d613b3-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:22 crc kubenswrapper[4898]: I1211 13:31:22.616787 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-j2jpr" event={"ID":"77bd783a-a06c-400b-b29e-a6edb7d613b3","Type":"ContainerDied","Data":"648e619f7a82d254b30bda5be17f31d3eb893afc50974aee86dc2fc990df3ea7"} Dec 11 13:31:22 crc kubenswrapper[4898]: I1211 13:31:22.616830 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="648e619f7a82d254b30bda5be17f31d3eb893afc50974aee86dc2fc990df3ea7" Dec 11 13:31:22 crc kubenswrapper[4898]: I1211 13:31:22.616868 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-j2jpr" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.563254 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-686c7f94b-jlnr7"] Dec 11 13:31:23 crc kubenswrapper[4898]: E1211 13:31:23.564074 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df10e08a-4d2c-4260-8930-0a64e2fe8b0d" containerName="dnsmasq-dns" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.564088 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="df10e08a-4d2c-4260-8930-0a64e2fe8b0d" containerName="dnsmasq-dns" Dec 11 13:31:23 crc kubenswrapper[4898]: E1211 13:31:23.564127 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df10e08a-4d2c-4260-8930-0a64e2fe8b0d" containerName="init" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.564135 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="df10e08a-4d2c-4260-8930-0a64e2fe8b0d" containerName="init" Dec 11 13:31:23 crc kubenswrapper[4898]: E1211 13:31:23.564157 
4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77bd783a-a06c-400b-b29e-a6edb7d613b3" containerName="heat-db-sync" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.564164 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bd783a-a06c-400b-b29e-a6edb7d613b3" containerName="heat-db-sync" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.564396 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="df10e08a-4d2c-4260-8930-0a64e2fe8b0d" containerName="dnsmasq-dns" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.564410 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="77bd783a-a06c-400b-b29e-a6edb7d613b3" containerName="heat-db-sync" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.565286 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-686c7f94b-jlnr7" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.601566 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-686c7f94b-jlnr7"] Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.617434 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-58ffc484cf-pk2vt"] Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.619505 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.643585 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-58ffc484cf-pk2vt"] Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.650591 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d75f767dc-9gqbj" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.671150 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-68bbb97c49-zk2lz"] Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.675081 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.680413 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836f22d0-0883-463e-942b-abb6931a997f-combined-ca-bundle\") pod \"heat-engine-686c7f94b-jlnr7\" (UID: \"836f22d0-0883-463e-942b-abb6931a997f\") " pod="openstack/heat-engine-686c7f94b-jlnr7" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.680723 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqc2l\" (UniqueName: \"kubernetes.io/projected/836f22d0-0883-463e-942b-abb6931a997f-kube-api-access-tqc2l\") pod \"heat-engine-686c7f94b-jlnr7\" (UID: \"836f22d0-0883-463e-942b-abb6931a997f\") " pod="openstack/heat-engine-686c7f94b-jlnr7" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.680952 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/836f22d0-0883-463e-942b-abb6931a997f-config-data-custom\") pod \"heat-engine-686c7f94b-jlnr7\" (UID: \"836f22d0-0883-463e-942b-abb6931a997f\") " pod="openstack/heat-engine-686c7f94b-jlnr7" 
Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.683610 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836f22d0-0883-463e-942b-abb6931a997f-config-data\") pod \"heat-engine-686c7f94b-jlnr7\" (UID: \"836f22d0-0883-463e-942b-abb6931a997f\") " pod="openstack/heat-engine-686c7f94b-jlnr7" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.696365 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-68bbb97c49-zk2lz"] Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.786660 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83af347-3774-4e08-8138-6e67557da826-public-tls-certs\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.786720 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-config-data\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.786759 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-combined-ca-bundle\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.786788 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/836f22d0-0883-463e-942b-abb6931a997f-config-data\") pod \"heat-engine-686c7f94b-jlnr7\" (UID: \"836f22d0-0883-463e-942b-abb6931a997f\") " pod="openstack/heat-engine-686c7f94b-jlnr7" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.786816 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-public-tls-certs\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.786853 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94s5k\" (UniqueName: \"kubernetes.io/projected/d83af347-3774-4e08-8138-6e67557da826-kube-api-access-94s5k\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.786873 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-internal-tls-certs\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.786962 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83af347-3774-4e08-8138-6e67557da826-internal-tls-certs\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.787030 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-config-data-custom\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.787059 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d83af347-3774-4e08-8138-6e67557da826-combined-ca-bundle\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.787092 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836f22d0-0883-463e-942b-abb6931a997f-combined-ca-bundle\") pod \"heat-engine-686c7f94b-jlnr7\" (UID: \"836f22d0-0883-463e-942b-abb6931a997f\") " pod="openstack/heat-engine-686c7f94b-jlnr7" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.787148 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d83af347-3774-4e08-8138-6e67557da826-config-data-custom\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.787195 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqc2l\" (UniqueName: \"kubernetes.io/projected/836f22d0-0883-463e-942b-abb6931a997f-kube-api-access-tqc2l\") pod \"heat-engine-686c7f94b-jlnr7\" (UID: \"836f22d0-0883-463e-942b-abb6931a997f\") " pod="openstack/heat-engine-686c7f94b-jlnr7" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.787254 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xbnd\" (UniqueName: \"kubernetes.io/projected/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-kube-api-access-7xbnd\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.787307 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/836f22d0-0883-463e-942b-abb6931a997f-config-data-custom\") pod \"heat-engine-686c7f94b-jlnr7\" (UID: \"836f22d0-0883-463e-942b-abb6931a997f\") " pod="openstack/heat-engine-686c7f94b-jlnr7" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.787341 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d83af347-3774-4e08-8138-6e67557da826-config-data\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.795949 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-zhl76"] Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.797226 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/836f22d0-0883-463e-942b-abb6931a997f-config-data-custom\") pod \"heat-engine-686c7f94b-jlnr7\" (UID: \"836f22d0-0883-463e-942b-abb6931a997f\") " pod="openstack/heat-engine-686c7f94b-jlnr7" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.797760 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836f22d0-0883-463e-942b-abb6931a997f-config-data\") pod \"heat-engine-686c7f94b-jlnr7\" (UID: \"836f22d0-0883-463e-942b-abb6931a997f\") " 
pod="openstack/heat-engine-686c7f94b-jlnr7" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.803339 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" podUID="b39b71ba-a893-4a17-92da-bf3db2cf671a" containerName="dnsmasq-dns" containerID="cri-o://7308587a486c8094c4299e6bde339a714ddc26375cd0bd878c28450438ed879d" gracePeriod=10 Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.804329 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836f22d0-0883-463e-942b-abb6931a997f-combined-ca-bundle\") pod \"heat-engine-686c7f94b-jlnr7\" (UID: \"836f22d0-0883-463e-942b-abb6931a997f\") " pod="openstack/heat-engine-686c7f94b-jlnr7" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.823328 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqc2l\" (UniqueName: \"kubernetes.io/projected/836f22d0-0883-463e-942b-abb6931a997f-kube-api-access-tqc2l\") pod \"heat-engine-686c7f94b-jlnr7\" (UID: \"836f22d0-0883-463e-942b-abb6931a997f\") " pod="openstack/heat-engine-686c7f94b-jlnr7" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.889119 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-combined-ca-bundle\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.889171 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-public-tls-certs\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc 
kubenswrapper[4898]: I1211 13:31:23.889225 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94s5k\" (UniqueName: \"kubernetes.io/projected/d83af347-3774-4e08-8138-6e67557da826-kube-api-access-94s5k\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.889249 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-internal-tls-certs\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.889341 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83af347-3774-4e08-8138-6e67557da826-internal-tls-certs\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.889379 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-config-data-custom\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.889406 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d83af347-3774-4e08-8138-6e67557da826-combined-ca-bundle\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 
13:31:23.889492 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d83af347-3774-4e08-8138-6e67557da826-config-data-custom\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.889639 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xbnd\" (UniqueName: \"kubernetes.io/projected/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-kube-api-access-7xbnd\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.889703 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d83af347-3774-4e08-8138-6e67557da826-config-data\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.889771 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83af347-3774-4e08-8138-6e67557da826-public-tls-certs\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.889795 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-config-data\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.894573 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d83af347-3774-4e08-8138-6e67557da826-combined-ca-bundle\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.894909 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d83af347-3774-4e08-8138-6e67557da826-config-data\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.895792 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-public-tls-certs\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.897582 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83af347-3774-4e08-8138-6e67557da826-public-tls-certs\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.900083 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d83af347-3774-4e08-8138-6e67557da826-config-data-custom\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.900214 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-combined-ca-bundle\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.900573 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83af347-3774-4e08-8138-6e67557da826-internal-tls-certs\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.901645 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-internal-tls-certs\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.901854 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-config-data-custom\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.902991 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-config-data\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.907768 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xbnd\" (UniqueName: 
\"kubernetes.io/projected/911f5e17-1a51-4bf3-8f1c-cdedc2f4404c-kube-api-access-7xbnd\") pod \"heat-cfnapi-68bbb97c49-zk2lz\" (UID: \"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c\") " pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.916493 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94s5k\" (UniqueName: \"kubernetes.io/projected/d83af347-3774-4e08-8138-6e67557da826-kube-api-access-94s5k\") pod \"heat-api-58ffc484cf-pk2vt\" (UID: \"d83af347-3774-4e08-8138-6e67557da826\") " pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.918076 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-686c7f94b-jlnr7" Dec 11 13:31:23 crc kubenswrapper[4898]: I1211 13:31:23.969464 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.014298 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.374081 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.511726 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-config\") pod \"b39b71ba-a893-4a17-92da-bf3db2cf671a\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.511766 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-ovsdbserver-sb\") pod \"b39b71ba-a893-4a17-92da-bf3db2cf671a\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.511848 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzddq\" (UniqueName: \"kubernetes.io/projected/b39b71ba-a893-4a17-92da-bf3db2cf671a-kube-api-access-wzddq\") pod \"b39b71ba-a893-4a17-92da-bf3db2cf671a\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.511930 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-dns-svc\") pod \"b39b71ba-a893-4a17-92da-bf3db2cf671a\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.512051 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-dns-swift-storage-0\") pod \"b39b71ba-a893-4a17-92da-bf3db2cf671a\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.512126 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-openstack-edpm-ipam\") pod \"b39b71ba-a893-4a17-92da-bf3db2cf671a\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.512179 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-ovsdbserver-nb\") pod \"b39b71ba-a893-4a17-92da-bf3db2cf671a\" (UID: \"b39b71ba-a893-4a17-92da-bf3db2cf671a\") " Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.516691 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b39b71ba-a893-4a17-92da-bf3db2cf671a-kube-api-access-wzddq" (OuterVolumeSpecName: "kube-api-access-wzddq") pod "b39b71ba-a893-4a17-92da-bf3db2cf671a" (UID: "b39b71ba-a893-4a17-92da-bf3db2cf671a"). InnerVolumeSpecName "kube-api-access-wzddq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.593980 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b39b71ba-a893-4a17-92da-bf3db2cf671a" (UID: "b39b71ba-a893-4a17-92da-bf3db2cf671a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.600299 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b39b71ba-a893-4a17-92da-bf3db2cf671a" (UID: "b39b71ba-a893-4a17-92da-bf3db2cf671a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.612437 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-58ffc484cf-pk2vt"] Dec 11 13:31:24 crc kubenswrapper[4898]: W1211 13:31:24.613897 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd83af347_3774_4e08_8138_6e67557da826.slice/crio-3b04120f40df0ce85f7b69200a7312a6e3b47aca68da2c86f5137a99553c6eff WatchSource:0}: Error finding container 3b04120f40df0ce85f7b69200a7312a6e3b47aca68da2c86f5137a99553c6eff: Status 404 returned error can't find the container with id 3b04120f40df0ce85f7b69200a7312a6e3b47aca68da2c86f5137a99553c6eff Dec 11 13:31:24 crc kubenswrapper[4898]: W1211 13:31:24.614220 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod836f22d0_0883_463e_942b_abb6931a997f.slice/crio-550b56695bb5b8ab81531d8c66d0de880491b5d38b5f7accc59115f293a860bc WatchSource:0}: Error finding container 550b56695bb5b8ab81531d8c66d0de880491b5d38b5f7accc59115f293a860bc: Status 404 returned error can't find the container with id 550b56695bb5b8ab81531d8c66d0de880491b5d38b5f7accc59115f293a860bc Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.618547 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.618577 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.618589 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzddq\" 
(UniqueName: \"kubernetes.io/projected/b39b71ba-a893-4a17-92da-bf3db2cf671a-kube-api-access-wzddq\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.624473 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "b39b71ba-a893-4a17-92da-bf3db2cf671a" (UID: "b39b71ba-a893-4a17-92da-bf3db2cf671a"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.627696 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-config" (OuterVolumeSpecName: "config") pod "b39b71ba-a893-4a17-92da-bf3db2cf671a" (UID: "b39b71ba-a893-4a17-92da-bf3db2cf671a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.629348 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-686c7f94b-jlnr7"] Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.635275 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b39b71ba-a893-4a17-92da-bf3db2cf671a" (UID: "b39b71ba-a893-4a17-92da-bf3db2cf671a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.636619 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b39b71ba-a893-4a17-92da-bf3db2cf671a" (UID: "b39b71ba-a893-4a17-92da-bf3db2cf671a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.664742 4898 generic.go:334] "Generic (PLEG): container finished" podID="b39b71ba-a893-4a17-92da-bf3db2cf671a" containerID="7308587a486c8094c4299e6bde339a714ddc26375cd0bd878c28450438ed879d" exitCode=0 Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.664790 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.664821 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" event={"ID":"b39b71ba-a893-4a17-92da-bf3db2cf671a","Type":"ContainerDied","Data":"7308587a486c8094c4299e6bde339a714ddc26375cd0bd878c28450438ed879d"} Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.664851 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-zhl76" event={"ID":"b39b71ba-a893-4a17-92da-bf3db2cf671a","Type":"ContainerDied","Data":"2c6dec5b611c7677470bdd60300521e9ff38fd47785aed2ed144668aa94754ba"} Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.664868 4898 scope.go:117] "RemoveContainer" containerID="7308587a486c8094c4299e6bde339a714ddc26375cd0bd878c28450438ed879d" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.666139 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58ffc484cf-pk2vt" event={"ID":"d83af347-3774-4e08-8138-6e67557da826","Type":"ContainerStarted","Data":"3b04120f40df0ce85f7b69200a7312a6e3b47aca68da2c86f5137a99553c6eff"} Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.670152 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-686c7f94b-jlnr7" event={"ID":"836f22d0-0883-463e-942b-abb6931a997f","Type":"ContainerStarted","Data":"550b56695bb5b8ab81531d8c66d0de880491b5d38b5f7accc59115f293a860bc"} Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.705111 4898 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-zhl76"] Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.709704 4898 scope.go:117] "RemoveContainer" containerID="0696c18c4423a2b48b5deac7fdd2156f4c06f7570111249dc08cb99e6aae8183" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.720309 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.720341 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-config\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.720353 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.720363 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b39b71ba-a893-4a17-92da-bf3db2cf671a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.723045 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-zhl76"] Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.746670 4898 scope.go:117] "RemoveContainer" containerID="7308587a486c8094c4299e6bde339a714ddc26375cd0bd878c28450438ed879d" Dec 11 13:31:24 crc kubenswrapper[4898]: E1211 13:31:24.747184 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7308587a486c8094c4299e6bde339a714ddc26375cd0bd878c28450438ed879d\": container with ID starting with 
7308587a486c8094c4299e6bde339a714ddc26375cd0bd878c28450438ed879d not found: ID does not exist" containerID="7308587a486c8094c4299e6bde339a714ddc26375cd0bd878c28450438ed879d" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.747220 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7308587a486c8094c4299e6bde339a714ddc26375cd0bd878c28450438ed879d"} err="failed to get container status \"7308587a486c8094c4299e6bde339a714ddc26375cd0bd878c28450438ed879d\": rpc error: code = NotFound desc = could not find container \"7308587a486c8094c4299e6bde339a714ddc26375cd0bd878c28450438ed879d\": container with ID starting with 7308587a486c8094c4299e6bde339a714ddc26375cd0bd878c28450438ed879d not found: ID does not exist" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.747246 4898 scope.go:117] "RemoveContainer" containerID="0696c18c4423a2b48b5deac7fdd2156f4c06f7570111249dc08cb99e6aae8183" Dec 11 13:31:24 crc kubenswrapper[4898]: E1211 13:31:24.747764 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0696c18c4423a2b48b5deac7fdd2156f4c06f7570111249dc08cb99e6aae8183\": container with ID starting with 0696c18c4423a2b48b5deac7fdd2156f4c06f7570111249dc08cb99e6aae8183 not found: ID does not exist" containerID="0696c18c4423a2b48b5deac7fdd2156f4c06f7570111249dc08cb99e6aae8183" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.747788 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0696c18c4423a2b48b5deac7fdd2156f4c06f7570111249dc08cb99e6aae8183"} err="failed to get container status \"0696c18c4423a2b48b5deac7fdd2156f4c06f7570111249dc08cb99e6aae8183\": rpc error: code = NotFound desc = could not find container \"0696c18c4423a2b48b5deac7fdd2156f4c06f7570111249dc08cb99e6aae8183\": container with ID starting with 0696c18c4423a2b48b5deac7fdd2156f4c06f7570111249dc08cb99e6aae8183 not found: ID does not 
exist" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.795416 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b39b71ba-a893-4a17-92da-bf3db2cf671a" path="/var/lib/kubelet/pods/b39b71ba-a893-4a17-92da-bf3db2cf671a/volumes" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.796335 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 11 13:31:24 crc kubenswrapper[4898]: I1211 13:31:24.820320 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-68bbb97c49-zk2lz"] Dec 11 13:31:25 crc kubenswrapper[4898]: I1211 13:31:25.692934 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-686c7f94b-jlnr7" event={"ID":"836f22d0-0883-463e-942b-abb6931a997f","Type":"ContainerStarted","Data":"850976ce7db18da3cf3f91a071ac7a5bbd83b8362a342f5fb2e0da727777ed10"} Dec 11 13:31:25 crc kubenswrapper[4898]: I1211 13:31:25.694794 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-686c7f94b-jlnr7" Dec 11 13:31:25 crc kubenswrapper[4898]: I1211 13:31:25.703615 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d71ee22f-68e7-43d7-8a6a-012ff8b8104e","Type":"ContainerStarted","Data":"d9a08fbb5b93e0892e7764b24571d0aa7e4a85ae6a652d601607d681dfc4b9ff"} Dec 11 13:31:25 crc kubenswrapper[4898]: I1211 13:31:25.712514 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" event={"ID":"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c","Type":"ContainerStarted","Data":"6c51659a6113c1aff003a66606f643e5a9c81bc38b81511d201acc544f1cc69c"} Dec 11 13:31:25 crc kubenswrapper[4898]: I1211 13:31:25.718332 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-686c7f94b-jlnr7" podStartSLOduration=2.71830895 podStartE2EDuration="2.71830895s" podCreationTimestamp="2025-12-11 13:31:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:31:25.710084759 +0000 UTC m=+1643.282411206" watchObservedRunningTime="2025-12-11 13:31:25.71830895 +0000 UTC m=+1643.290635397" Dec 11 13:31:25 crc kubenswrapper[4898]: I1211 13:31:25.743677 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.657871026 podStartE2EDuration="43.74365249s" podCreationTimestamp="2025-12-11 13:30:42 +0000 UTC" firstStartedPulling="2025-12-11 13:30:43.885688202 +0000 UTC m=+1601.458014639" lastFinishedPulling="2025-12-11 13:31:24.971469666 +0000 UTC m=+1642.543796103" observedRunningTime="2025-12-11 13:31:25.740282219 +0000 UTC m=+1643.312608676" watchObservedRunningTime="2025-12-11 13:31:25.74365249 +0000 UTC m=+1643.315978927" Dec 11 13:31:26 crc kubenswrapper[4898]: I1211 13:31:26.723343 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" event={"ID":"911f5e17-1a51-4bf3-8f1c-cdedc2f4404c","Type":"ContainerStarted","Data":"f5f1a0a09e8b9b69b90e37ab0f09a6b84148db1a192d367ab4d83af482b941ca"} Dec 11 13:31:26 crc kubenswrapper[4898]: I1211 13:31:26.724055 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:26 crc kubenswrapper[4898]: I1211 13:31:26.725904 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-58ffc484cf-pk2vt" event={"ID":"d83af347-3774-4e08-8138-6e67557da826","Type":"ContainerStarted","Data":"256e552abf3931b0a4a70617dfc320bda93a330faf39bdad05666ceaac3586c7"} Dec 11 13:31:26 crc kubenswrapper[4898]: I1211 13:31:26.726031 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:26 crc kubenswrapper[4898]: I1211 13:31:26.741392 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" podStartSLOduration=2.408816808 podStartE2EDuration="3.741372684s" podCreationTimestamp="2025-12-11 13:31:23 +0000 UTC" firstStartedPulling="2025-12-11 13:31:24.828410518 +0000 UTC m=+1642.400736955" lastFinishedPulling="2025-12-11 13:31:26.160966394 +0000 UTC m=+1643.733292831" observedRunningTime="2025-12-11 13:31:26.739979576 +0000 UTC m=+1644.312306023" watchObservedRunningTime="2025-12-11 13:31:26.741372684 +0000 UTC m=+1644.313699111" Dec 11 13:31:26 crc kubenswrapper[4898]: I1211 13:31:26.763038 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-58ffc484cf-pk2vt" podStartSLOduration=2.221359599 podStartE2EDuration="3.763008214s" podCreationTimestamp="2025-12-11 13:31:23 +0000 UTC" firstStartedPulling="2025-12-11 13:31:24.61673468 +0000 UTC m=+1642.189061117" lastFinishedPulling="2025-12-11 13:31:26.158383295 +0000 UTC m=+1643.730709732" observedRunningTime="2025-12-11 13:31:26.75873974 +0000 UTC m=+1644.331066187" watchObservedRunningTime="2025-12-11 13:31:26.763008214 +0000 UTC m=+1644.335334651" Dec 11 13:31:34 crc kubenswrapper[4898]: I1211 13:31:34.995800 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:31:34 crc kubenswrapper[4898]: I1211 13:31:34.996386 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:31:35 crc kubenswrapper[4898]: I1211 13:31:35.427169 4898 scope.go:117] "RemoveContainer" 
containerID="19984b831f4957ac3cf343e7090d9ec3c271a60a5d0b21b8b0f78fbb9f8a6bdf" Dec 11 13:31:35 crc kubenswrapper[4898]: I1211 13:31:35.546607 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-58ffc484cf-pk2vt" Dec 11 13:31:35 crc kubenswrapper[4898]: I1211 13:31:35.583904 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" Dec 11 13:31:35 crc kubenswrapper[4898]: I1211 13:31:35.706603 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-59dc5cddb6-qswpf"] Dec 11 13:31:35 crc kubenswrapper[4898]: I1211 13:31:35.707126 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-59dc5cddb6-qswpf" podUID="4def8e33-aab9-4c66-8b73-c866ac6c5047" containerName="heat-api" containerID="cri-o://6020ea8fd58098a1bd89bcd4ea035d07467ae603bbc9618ec25600a4f889f68a" gracePeriod=60 Dec 11 13:31:35 crc kubenswrapper[4898]: I1211 13:31:35.750497 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7f89bf5c7d-t54nx"] Dec 11 13:31:35 crc kubenswrapper[4898]: I1211 13:31:35.750689 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" podUID="84a913ce-6f2e-4327-89b0-eb40be31c03e" containerName="heat-cfnapi" containerID="cri-o://e9a47768d5332a760761752a343cf5b0496d0addbfc153fedc62814b54cdd698" gracePeriod=60 Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.680496 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp"] Dec 11 13:31:38 crc kubenswrapper[4898]: E1211 13:31:38.681618 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39b71ba-a893-4a17-92da-bf3db2cf671a" containerName="dnsmasq-dns" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.681635 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b39b71ba-a893-4a17-92da-bf3db2cf671a" containerName="dnsmasq-dns" Dec 11 13:31:38 crc kubenswrapper[4898]: E1211 13:31:38.681666 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39b71ba-a893-4a17-92da-bf3db2cf671a" containerName="init" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.681674 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39b71ba-a893-4a17-92da-bf3db2cf671a" containerName="init" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.681955 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39b71ba-a893-4a17-92da-bf3db2cf671a" containerName="dnsmasq-dns" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.683146 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.685671 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.686037 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.686292 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.686437 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.696030 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp"] Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.729278 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.729342 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dh4s\" (UniqueName: \"kubernetes.io/projected/8831e333-4a32-4f65-85e7-16a9a95360c9-kube-api-access-9dh4s\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.729422 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.729557 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.832364 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.832418 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dh4s\" (UniqueName: \"kubernetes.io/projected/8831e333-4a32-4f65-85e7-16a9a95360c9-kube-api-access-9dh4s\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.832495 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.832573 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.837925 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.838152 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.848811 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.853300 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dh4s\" (UniqueName: \"kubernetes.io/projected/8831e333-4a32-4f65-85e7-16a9a95360c9-kube-api-access-9dh4s\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.900164 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-59dc5cddb6-qswpf" podUID="4def8e33-aab9-4c66-8b73-c866ac6c5047" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.224:8004/healthcheck\": read tcp 10.217.0.2:53738->10.217.0.224:8004: read: connection reset by peer" Dec 11 13:31:38 crc kubenswrapper[4898]: I1211 13:31:38.902353 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" podUID="84a913ce-6f2e-4327-89b0-eb40be31c03e" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.225:8000/healthcheck\": read tcp 10.217.0.2:47976->10.217.0.225:8000: read: connection reset by peer" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.011829 4898 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.375910 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.460200 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-config-data-custom\") pod \"84a913ce-6f2e-4327-89b0-eb40be31c03e\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.460317 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-combined-ca-bundle\") pod \"84a913ce-6f2e-4327-89b0-eb40be31c03e\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.460394 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk7ds\" (UniqueName: \"kubernetes.io/projected/84a913ce-6f2e-4327-89b0-eb40be31c03e-kube-api-access-jk7ds\") pod \"84a913ce-6f2e-4327-89b0-eb40be31c03e\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.460471 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-internal-tls-certs\") pod \"84a913ce-6f2e-4327-89b0-eb40be31c03e\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.460540 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-config-data\") pod \"84a913ce-6f2e-4327-89b0-eb40be31c03e\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.460651 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-public-tls-certs\") pod \"84a913ce-6f2e-4327-89b0-eb40be31c03e\" (UID: \"84a913ce-6f2e-4327-89b0-eb40be31c03e\") " Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.470256 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84a913ce-6f2e-4327-89b0-eb40be31c03e-kube-api-access-jk7ds" (OuterVolumeSpecName: "kube-api-access-jk7ds") pod "84a913ce-6f2e-4327-89b0-eb40be31c03e" (UID: "84a913ce-6f2e-4327-89b0-eb40be31c03e"). InnerVolumeSpecName "kube-api-access-jk7ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.475595 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "84a913ce-6f2e-4327-89b0-eb40be31c03e" (UID: "84a913ce-6f2e-4327-89b0-eb40be31c03e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.552957 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84a913ce-6f2e-4327-89b0-eb40be31c03e" (UID: "84a913ce-6f2e-4327-89b0-eb40be31c03e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.565323 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "84a913ce-6f2e-4327-89b0-eb40be31c03e" (UID: "84a913ce-6f2e-4327-89b0-eb40be31c03e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.565707 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.565734 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.565744 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk7ds\" (UniqueName: \"kubernetes.io/projected/84a913ce-6f2e-4327-89b0-eb40be31c03e-kube-api-access-jk7ds\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.565756 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.569641 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-config-data" (OuterVolumeSpecName: "config-data") pod "84a913ce-6f2e-4327-89b0-eb40be31c03e" (UID: "84a913ce-6f2e-4327-89b0-eb40be31c03e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.597031 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "84a913ce-6f2e-4327-89b0-eb40be31c03e" (UID: "84a913ce-6f2e-4327-89b0-eb40be31c03e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.668046 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.668278 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a913ce-6f2e-4327-89b0-eb40be31c03e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.796693 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp"] Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.871839 4898 generic.go:334] "Generic (PLEG): container finished" podID="4def8e33-aab9-4c66-8b73-c866ac6c5047" containerID="6020ea8fd58098a1bd89bcd4ea035d07467ae603bbc9618ec25600a4f889f68a" exitCode=0 Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.872171 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59dc5cddb6-qswpf" event={"ID":"4def8e33-aab9-4c66-8b73-c866ac6c5047","Type":"ContainerDied","Data":"6020ea8fd58098a1bd89bcd4ea035d07467ae603bbc9618ec25600a4f889f68a"} Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.877853 4898 generic.go:334] "Generic (PLEG): container finished" podID="91c646bc-40ca-434e-8db2-df2eb46c4e5e" 
containerID="4f5efd791407aba9140531ae3f50bf656aa889c9f67ee48de88c9393b231848c" exitCode=0 Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.877927 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"91c646bc-40ca-434e-8db2-df2eb46c4e5e","Type":"ContainerDied","Data":"4f5efd791407aba9140531ae3f50bf656aa889c9f67ee48de88c9393b231848c"} Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.885346 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" event={"ID":"8831e333-4a32-4f65-85e7-16a9a95360c9","Type":"ContainerStarted","Data":"0258d62e3dc79794e62212ed082321d988df13b9473d63db77fd236b4f9f9f60"} Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.894958 4898 generic.go:334] "Generic (PLEG): container finished" podID="84a913ce-6f2e-4327-89b0-eb40be31c03e" containerID="e9a47768d5332a760761752a343cf5b0496d0addbfc153fedc62814b54cdd698" exitCode=0 Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.895195 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.895236 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" event={"ID":"84a913ce-6f2e-4327-89b0-eb40be31c03e","Type":"ContainerDied","Data":"e9a47768d5332a760761752a343cf5b0496d0addbfc153fedc62814b54cdd698"} Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.895852 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7f89bf5c7d-t54nx" event={"ID":"84a913ce-6f2e-4327-89b0-eb40be31c03e","Type":"ContainerDied","Data":"6e213a479ab98c9dd74395981e83aafce69b368b8733a4815c07d410648ff41f"} Dec 11 13:31:39 crc kubenswrapper[4898]: I1211 13:31:39.895899 4898 scope.go:117] "RemoveContainer" containerID="e9a47768d5332a760761752a343cf5b0496d0addbfc153fedc62814b54cdd698" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.130855 4898 scope.go:117] "RemoveContainer" containerID="e9a47768d5332a760761752a343cf5b0496d0addbfc153fedc62814b54cdd698" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.141555 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:31:40 crc kubenswrapper[4898]: E1211 13:31:40.142624 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a47768d5332a760761752a343cf5b0496d0addbfc153fedc62814b54cdd698\": container with ID starting with e9a47768d5332a760761752a343cf5b0496d0addbfc153fedc62814b54cdd698 not found: ID does not exist" containerID="e9a47768d5332a760761752a343cf5b0496d0addbfc153fedc62814b54cdd698" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.142670 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a47768d5332a760761752a343cf5b0496d0addbfc153fedc62814b54cdd698"} err="failed to get container status \"e9a47768d5332a760761752a343cf5b0496d0addbfc153fedc62814b54cdd698\": rpc error: code = NotFound desc = could not find container \"e9a47768d5332a760761752a343cf5b0496d0addbfc153fedc62814b54cdd698\": container with ID starting with e9a47768d5332a760761752a343cf5b0496d0addbfc153fedc62814b54cdd698 not found: ID does not exist" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.148705 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-7f89bf5c7d-t54nx"] Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.162528 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-7f89bf5c7d-t54nx"] Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.184900 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-combined-ca-bundle\") pod \"4def8e33-aab9-4c66-8b73-c866ac6c5047\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.185021 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-config-data-custom\") pod \"4def8e33-aab9-4c66-8b73-c866ac6c5047\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.185129 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-config-data\") pod \"4def8e33-aab9-4c66-8b73-c866ac6c5047\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.185293 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xptfp\" (UniqueName: \"kubernetes.io/projected/4def8e33-aab9-4c66-8b73-c866ac6c5047-kube-api-access-xptfp\") pod \"4def8e33-aab9-4c66-8b73-c866ac6c5047\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.185327 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-internal-tls-certs\") pod \"4def8e33-aab9-4c66-8b73-c866ac6c5047\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.185393 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-public-tls-certs\") pod \"4def8e33-aab9-4c66-8b73-c866ac6c5047\" (UID: \"4def8e33-aab9-4c66-8b73-c866ac6c5047\") " Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.199939 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4def8e33-aab9-4c66-8b73-c866ac6c5047" (UID: "4def8e33-aab9-4c66-8b73-c866ac6c5047"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.200650 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4def8e33-aab9-4c66-8b73-c866ac6c5047-kube-api-access-xptfp" (OuterVolumeSpecName: "kube-api-access-xptfp") pod "4def8e33-aab9-4c66-8b73-c866ac6c5047" (UID: "4def8e33-aab9-4c66-8b73-c866ac6c5047"). InnerVolumeSpecName "kube-api-access-xptfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.242623 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4def8e33-aab9-4c66-8b73-c866ac6c5047" (UID: "4def8e33-aab9-4c66-8b73-c866ac6c5047"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.289687 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xptfp\" (UniqueName: \"kubernetes.io/projected/4def8e33-aab9-4c66-8b73-c866ac6c5047-kube-api-access-xptfp\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.289729 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.289744 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.309647 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-config-data" (OuterVolumeSpecName: "config-data") pod "4def8e33-aab9-4c66-8b73-c866ac6c5047" (UID: "4def8e33-aab9-4c66-8b73-c866ac6c5047"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.311627 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4def8e33-aab9-4c66-8b73-c866ac6c5047" (UID: "4def8e33-aab9-4c66-8b73-c866ac6c5047"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.337699 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4def8e33-aab9-4c66-8b73-c866ac6c5047" (UID: "4def8e33-aab9-4c66-8b73-c866ac6c5047"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.394358 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.394401 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.394415 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4def8e33-aab9-4c66-8b73-c866ac6c5047-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.792279 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84a913ce-6f2e-4327-89b0-eb40be31c03e" path="/var/lib/kubelet/pods/84a913ce-6f2e-4327-89b0-eb40be31c03e/volumes" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.909215 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-59dc5cddb6-qswpf" event={"ID":"4def8e33-aab9-4c66-8b73-c866ac6c5047","Type":"ContainerDied","Data":"4d47a2ecefda2198c17a4c10b42f72ca37c81d0e136444649e79b5514837f21b"} Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.909264 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-59dc5cddb6-qswpf" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.909271 4898 scope.go:117] "RemoveContainer" containerID="6020ea8fd58098a1bd89bcd4ea035d07467ae603bbc9618ec25600a4f889f68a" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.912734 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"91c646bc-40ca-434e-8db2-df2eb46c4e5e","Type":"ContainerStarted","Data":"48af6b37f233e07dec97433baecb224166e6e99670ad3d5d929d45db8603ea68"} Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.913042 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:40 crc kubenswrapper[4898]: I1211 13:31:40.969977 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.969958544 podStartE2EDuration="36.969958544s" podCreationTimestamp="2025-12-11 13:31:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:31:40.962937206 +0000 UTC m=+1658.535263663" watchObservedRunningTime="2025-12-11 13:31:40.969958544 +0000 UTC m=+1658.542284981" Dec 11 13:31:41 crc kubenswrapper[4898]: I1211 13:31:41.019338 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-59dc5cddb6-qswpf"] Dec 11 13:31:41 crc kubenswrapper[4898]: I1211 13:31:41.040049 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-59dc5cddb6-qswpf"] Dec 11 13:31:41 crc kubenswrapper[4898]: I1211 13:31:41.927611 4898 generic.go:334] "Generic (PLEG): container finished" podID="d7d19abc-90d0-413d-b8d7-67ae58b010f7" containerID="62ab4288f949bc1e3c7a921fe13871a2ec5e772ccee672c86dab942ef3b6c0ed" exitCode=0 Dec 11 13:31:41 crc kubenswrapper[4898]: I1211 13:31:41.928961 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"d7d19abc-90d0-413d-b8d7-67ae58b010f7","Type":"ContainerDied","Data":"62ab4288f949bc1e3c7a921fe13871a2ec5e772ccee672c86dab942ef3b6c0ed"} Dec 11 13:31:42 crc kubenswrapper[4898]: I1211 13:31:42.795206 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4def8e33-aab9-4c66-8b73-c866ac6c5047" path="/var/lib/kubelet/pods/4def8e33-aab9-4c66-8b73-c866ac6c5047/volumes" Dec 11 13:31:42 crc kubenswrapper[4898]: I1211 13:31:42.952340 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d7d19abc-90d0-413d-b8d7-67ae58b010f7","Type":"ContainerStarted","Data":"98b08c30581099b10a47e7ecee901e4ac5154b3d371962d23a8e47b2ded94dcf"} Dec 11 13:31:42 crc kubenswrapper[4898]: I1211 13:31:42.952813 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 11 13:31:42 crc kubenswrapper[4898]: I1211 13:31:42.981843 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.981825532 podStartE2EDuration="37.981825532s" podCreationTimestamp="2025-12-11 13:31:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 13:31:42.973654373 +0000 UTC m=+1660.545980810" watchObservedRunningTime="2025-12-11 13:31:42.981825532 +0000 UTC m=+1660.554151969" Dec 11 13:31:43 crc kubenswrapper[4898]: I1211 13:31:43.969740 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-686c7f94b-jlnr7" Dec 11 13:31:44 crc kubenswrapper[4898]: I1211 13:31:44.025493 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5f8cd7c4cf-7zq47"] Dec 11 13:31:44 crc kubenswrapper[4898]: I1211 13:31:44.025731 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-5f8cd7c4cf-7zq47" 
podUID="e113da52-055b-47d0-b16a-2fef9d302bbe" containerName="heat-engine" containerID="cri-o://4219744614ae19f5e8e2ef87f6e046fe6466914b8739d3d3af7ad27b29bc9fd1" gracePeriod=60 Dec 11 13:31:47 crc kubenswrapper[4898]: I1211 13:31:47.738725 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-f9lpn"] Dec 11 13:31:47 crc kubenswrapper[4898]: I1211 13:31:47.757346 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-f9lpn"] Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.070364 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-8hx6m"] Dec 11 13:31:48 crc kubenswrapper[4898]: E1211 13:31:48.071324 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84a913ce-6f2e-4327-89b0-eb40be31c03e" containerName="heat-cfnapi" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.071354 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a913ce-6f2e-4327-89b0-eb40be31c03e" containerName="heat-cfnapi" Dec 11 13:31:48 crc kubenswrapper[4898]: E1211 13:31:48.071420 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4def8e33-aab9-4c66-8b73-c866ac6c5047" containerName="heat-api" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.071430 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4def8e33-aab9-4c66-8b73-c866ac6c5047" containerName="heat-api" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.071772 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4def8e33-aab9-4c66-8b73-c866ac6c5047" containerName="heat-api" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.071824 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="84a913ce-6f2e-4327-89b0-eb40be31c03e" containerName="heat-cfnapi" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.073311 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.077124 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.093691 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-8hx6m"] Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.214889 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d9lj\" (UniqueName: \"kubernetes.io/projected/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-kube-api-access-4d9lj\") pod \"aodh-db-sync-8hx6m\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.215482 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-scripts\") pod \"aodh-db-sync-8hx6m\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.215619 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-config-data\") pod \"aodh-db-sync-8hx6m\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.216008 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-combined-ca-bundle\") pod \"aodh-db-sync-8hx6m\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.318310 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d9lj\" (UniqueName: \"kubernetes.io/projected/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-kube-api-access-4d9lj\") pod \"aodh-db-sync-8hx6m\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.318374 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-scripts\") pod \"aodh-db-sync-8hx6m\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.318414 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-config-data\") pod \"aodh-db-sync-8hx6m\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.318594 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-combined-ca-bundle\") pod \"aodh-db-sync-8hx6m\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.325847 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-combined-ca-bundle\") pod \"aodh-db-sync-8hx6m\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.336797 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-config-data\") pod \"aodh-db-sync-8hx6m\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.338673 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-scripts\") pod \"aodh-db-sync-8hx6m\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.339597 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d9lj\" (UniqueName: \"kubernetes.io/projected/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-kube-api-access-4d9lj\") pod \"aodh-db-sync-8hx6m\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.407013 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:31:48 crc kubenswrapper[4898]: I1211 13:31:48.790728 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b381d75-882f-425b-9c58-ec00804fda34" path="/var/lib/kubelet/pods/0b381d75-882f-425b-9c58-ec00804fda34/volumes" Dec 11 13:31:52 crc kubenswrapper[4898]: E1211 13:31:52.688022 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4219744614ae19f5e8e2ef87f6e046fe6466914b8739d3d3af7ad27b29bc9fd1" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 11 13:31:52 crc kubenswrapper[4898]: E1211 13:31:52.690150 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4219744614ae19f5e8e2ef87f6e046fe6466914b8739d3d3af7ad27b29bc9fd1" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 11 13:31:52 crc kubenswrapper[4898]: E1211 13:31:52.691615 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4219744614ae19f5e8e2ef87f6e046fe6466914b8739d3d3af7ad27b29bc9fd1" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 11 13:31:52 crc kubenswrapper[4898]: E1211 13:31:52.691655 4898 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5f8cd7c4cf-7zq47" podUID="e113da52-055b-47d0-b16a-2fef9d302bbe" containerName="heat-engine" Dec 11 13:31:54 crc kubenswrapper[4898]: I1211 13:31:54.668734 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Dec 11 13:31:56 crc kubenswrapper[4898]: E1211 13:31:56.300640 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest" Dec 11 13:31:56 crc kubenswrapper[4898]: E1211 13:31:56.301269 4898 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 11 13:31:56 crc kubenswrapper[4898]: container &Container{Name:repo-setup-edpm-deployment-openstack-edpm-ipam,Image:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,Command:[],Args:[ansible-runner run /runner -p playbook.yaml -i repo-setup-edpm-deployment-openstack-edpm-ipam],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Dec 11 13:31:56 crc kubenswrapper[4898]: - hosts: all Dec 11 13:31:56 crc kubenswrapper[4898]: strategy: linear Dec 11 13:31:56 crc kubenswrapper[4898]: tasks: Dec 11 13:31:56 crc kubenswrapper[4898]: - name: Enable podified-repos Dec 11 13:31:56 crc kubenswrapper[4898]: become: true Dec 11 13:31:56 crc kubenswrapper[4898]: ansible.builtin.shell: | Dec 11 13:31:56 crc kubenswrapper[4898]: set -euxo pipefail Dec 11 13:31:56 crc kubenswrapper[4898]: pushd /var/tmp Dec 11 13:31:56 crc kubenswrapper[4898]: curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz Dec 11 13:31:56 crc kubenswrapper[4898]: pushd repo-setup-main Dec 11 13:31:56 crc kubenswrapper[4898]: python3 -m venv ./venv Dec 11 13:31:56 crc kubenswrapper[4898]: PBR_VERSION=0.0.0 ./venv/bin/pip install ./ Dec 11 13:31:56 crc kubenswrapper[4898]: ./venv/bin/repo-setup current-podified -b antelope Dec 11 13:31:56 crc kubenswrapper[4898]: popd Dec 11 13:31:56 crc kubenswrapper[4898]: rm -rf repo-setup-main Dec 11 13:31:56 crc kubenswrapper[4898]: Dec 11 13:31:56 crc kubenswrapper[4898]: Dec 11 13:31:56 crc 
kubenswrapper[4898]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Dec 11 13:31:56 crc kubenswrapper[4898]: edpm_override_hosts: openstack-edpm-ipam Dec 11 13:31:56 crc kubenswrapper[4898]: edpm_service_type: repo-setup Dec 11 13:31:56 crc kubenswrapper[4898]: Dec 11 13:31:56 crc kubenswrapper[4898]: Dec 11 13:31:56 crc kubenswrapper[4898]: ,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:repo-setup-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/repo-setup,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/runner/env/ssh_key,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9dh4s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp_openstack(8831e333-4a32-4f65-85e7-16a9a95360c9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Dec 11 13:31:56 crc kubenswrapper[4898]: > logger="UnhandledError" Dec 11 13:31:56 crc kubenswrapper[4898]: E1211 13:31:56.302534 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" podUID="8831e333-4a32-4f65-85e7-16a9a95360c9" Dec 11 13:31:56 crc kubenswrapper[4898]: I1211 13:31:56.379671 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 11 13:31:56 crc kubenswrapper[4898]: I1211 13:31:56.973517 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-8hx6m"] Dec 11 13:31:57 crc kubenswrapper[4898]: I1211 13:31:57.164506 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8hx6m" event={"ID":"0c27755f-ec0b-4b73-a18b-f09ed1715ef7","Type":"ContainerStarted","Data":"4e32515575a152bcff775c9b33abf432df9cafe731d73dcf30140c9b690cf97c"} Dec 11 13:31:57 crc kubenswrapper[4898]: E1211 13:31:57.166004 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest\\\"\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" podUID="8831e333-4a32-4f65-85e7-16a9a95360c9" Dec 11 13:31:59 crc kubenswrapper[4898]: I1211 13:31:59.191772 4898 generic.go:334] "Generic (PLEG): container finished" podID="e113da52-055b-47d0-b16a-2fef9d302bbe" containerID="4219744614ae19f5e8e2ef87f6e046fe6466914b8739d3d3af7ad27b29bc9fd1" exitCode=0 
Dec 11 13:31:59 crc kubenswrapper[4898]: I1211 13:31:59.191859 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5f8cd7c4cf-7zq47" event={"ID":"e113da52-055b-47d0-b16a-2fef9d302bbe","Type":"ContainerDied","Data":"4219744614ae19f5e8e2ef87f6e046fe6466914b8739d3d3af7ad27b29bc9fd1"} Dec 11 13:32:01 crc kubenswrapper[4898]: I1211 13:32:01.752660 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:32:01 crc kubenswrapper[4898]: I1211 13:32:01.869552 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-config-data-custom\") pod \"e113da52-055b-47d0-b16a-2fef9d302bbe\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " Dec 11 13:32:01 crc kubenswrapper[4898]: I1211 13:32:01.869606 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-combined-ca-bundle\") pod \"e113da52-055b-47d0-b16a-2fef9d302bbe\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " Dec 11 13:32:01 crc kubenswrapper[4898]: I1211 13:32:01.869742 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv5cw\" (UniqueName: \"kubernetes.io/projected/e113da52-055b-47d0-b16a-2fef9d302bbe-kube-api-access-bv5cw\") pod \"e113da52-055b-47d0-b16a-2fef9d302bbe\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " Dec 11 13:32:01 crc kubenswrapper[4898]: I1211 13:32:01.869922 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-config-data\") pod \"e113da52-055b-47d0-b16a-2fef9d302bbe\" (UID: \"e113da52-055b-47d0-b16a-2fef9d302bbe\") " Dec 11 13:32:01 crc kubenswrapper[4898]: I1211 13:32:01.875509 
4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e113da52-055b-47d0-b16a-2fef9d302bbe-kube-api-access-bv5cw" (OuterVolumeSpecName: "kube-api-access-bv5cw") pod "e113da52-055b-47d0-b16a-2fef9d302bbe" (UID: "e113da52-055b-47d0-b16a-2fef9d302bbe"). InnerVolumeSpecName "kube-api-access-bv5cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:32:01 crc kubenswrapper[4898]: I1211 13:32:01.875798 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e113da52-055b-47d0-b16a-2fef9d302bbe" (UID: "e113da52-055b-47d0-b16a-2fef9d302bbe"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:01 crc kubenswrapper[4898]: I1211 13:32:01.913664 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e113da52-055b-47d0-b16a-2fef9d302bbe" (UID: "e113da52-055b-47d0-b16a-2fef9d302bbe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:01 crc kubenswrapper[4898]: I1211 13:32:01.956380 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-config-data" (OuterVolumeSpecName: "config-data") pod "e113da52-055b-47d0-b16a-2fef9d302bbe" (UID: "e113da52-055b-47d0-b16a-2fef9d302bbe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:01 crc kubenswrapper[4898]: I1211 13:32:01.973393 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv5cw\" (UniqueName: \"kubernetes.io/projected/e113da52-055b-47d0-b16a-2fef9d302bbe-kube-api-access-bv5cw\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:01 crc kubenswrapper[4898]: I1211 13:32:01.973429 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:01 crc kubenswrapper[4898]: I1211 13:32:01.973444 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:01 crc kubenswrapper[4898]: I1211 13:32:01.973502 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e113da52-055b-47d0-b16a-2fef9d302bbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:02 crc kubenswrapper[4898]: I1211 13:32:02.228339 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5f8cd7c4cf-7zq47" event={"ID":"e113da52-055b-47d0-b16a-2fef9d302bbe","Type":"ContainerDied","Data":"a16387aef9d3e60ec85510eb7d40372515839f75829594ea56628444149f9b18"} Dec 11 13:32:02 crc kubenswrapper[4898]: I1211 13:32:02.229076 4898 scope.go:117] "RemoveContainer" containerID="4219744614ae19f5e8e2ef87f6e046fe6466914b8739d3d3af7ad27b29bc9fd1" Dec 11 13:32:02 crc kubenswrapper[4898]: I1211 13:32:02.228366 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5f8cd7c4cf-7zq47" Dec 11 13:32:02 crc kubenswrapper[4898]: I1211 13:32:02.231171 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8hx6m" event={"ID":"0c27755f-ec0b-4b73-a18b-f09ed1715ef7","Type":"ContainerStarted","Data":"f2617bd5248cbaeb588c5976d0c10977ae6e5fa5efacba0fe3d07fd368034cba"} Dec 11 13:32:02 crc kubenswrapper[4898]: I1211 13:32:02.273323 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-8hx6m" podStartSLOduration=9.787682388 podStartE2EDuration="14.273297084s" podCreationTimestamp="2025-12-11 13:31:48 +0000 UTC" firstStartedPulling="2025-12-11 13:31:56.988337825 +0000 UTC m=+1674.560664262" lastFinishedPulling="2025-12-11 13:32:01.473952521 +0000 UTC m=+1679.046278958" observedRunningTime="2025-12-11 13:32:02.254247813 +0000 UTC m=+1679.826574260" watchObservedRunningTime="2025-12-11 13:32:02.273297084 +0000 UTC m=+1679.845623531" Dec 11 13:32:02 crc kubenswrapper[4898]: I1211 13:32:02.286663 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5f8cd7c4cf-7zq47"] Dec 11 13:32:02 crc kubenswrapper[4898]: I1211 13:32:02.299646 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-5f8cd7c4cf-7zq47"] Dec 11 13:32:02 crc kubenswrapper[4898]: I1211 13:32:02.818713 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e113da52-055b-47d0-b16a-2fef9d302bbe" path="/var/lib/kubelet/pods/e113da52-055b-47d0-b16a-2fef9d302bbe/volumes" Dec 11 13:32:04 crc kubenswrapper[4898]: I1211 13:32:04.996099 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:32:04 crc kubenswrapper[4898]: I1211 13:32:04.997531 4898 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:32:04 crc kubenswrapper[4898]: I1211 13:32:04.997687 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:32:04 crc kubenswrapper[4898]: I1211 13:32:04.998770 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 13:32:04 crc kubenswrapper[4898]: I1211 13:32:04.998917 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" gracePeriod=600 Dec 11 13:32:05 crc kubenswrapper[4898]: I1211 13:32:05.272345 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" exitCode=0 Dec 11 13:32:05 crc kubenswrapper[4898]: I1211 13:32:05.272423 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901"} Dec 11 13:32:05 crc kubenswrapper[4898]: I1211 
13:32:05.272516 4898 scope.go:117] "RemoveContainer" containerID="0cb68cd95a282ab2d60091b5e9b86bf1ff863de795c2f88e13ffcb75e9110201" Dec 11 13:32:05 crc kubenswrapper[4898]: I1211 13:32:05.275084 4898 generic.go:334] "Generic (PLEG): container finished" podID="0c27755f-ec0b-4b73-a18b-f09ed1715ef7" containerID="f2617bd5248cbaeb588c5976d0c10977ae6e5fa5efacba0fe3d07fd368034cba" exitCode=0 Dec 11 13:32:05 crc kubenswrapper[4898]: I1211 13:32:05.275119 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8hx6m" event={"ID":"0c27755f-ec0b-4b73-a18b-f09ed1715ef7","Type":"ContainerDied","Data":"f2617bd5248cbaeb588c5976d0c10977ae6e5fa5efacba0fe3d07fd368034cba"} Dec 11 13:32:05 crc kubenswrapper[4898]: E1211 13:32:05.624367 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:32:06 crc kubenswrapper[4898]: I1211 13:32:06.288286 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:32:06 crc kubenswrapper[4898]: E1211 13:32:06.289040 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:32:06 crc kubenswrapper[4898]: I1211 13:32:06.725487 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:32:06 crc kubenswrapper[4898]: I1211 13:32:06.896605 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-scripts\") pod \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " Dec 11 13:32:06 crc kubenswrapper[4898]: I1211 13:32:06.896820 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d9lj\" (UniqueName: \"kubernetes.io/projected/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-kube-api-access-4d9lj\") pod \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " Dec 11 13:32:06 crc kubenswrapper[4898]: I1211 13:32:06.896994 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-combined-ca-bundle\") pod \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " Dec 11 13:32:06 crc kubenswrapper[4898]: I1211 13:32:06.897064 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-config-data\") pod \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\" (UID: \"0c27755f-ec0b-4b73-a18b-f09ed1715ef7\") " Dec 11 13:32:06 crc kubenswrapper[4898]: I1211 13:32:06.903739 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-scripts" (OuterVolumeSpecName: "scripts") pod "0c27755f-ec0b-4b73-a18b-f09ed1715ef7" (UID: "0c27755f-ec0b-4b73-a18b-f09ed1715ef7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:06 crc kubenswrapper[4898]: I1211 13:32:06.903831 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-kube-api-access-4d9lj" (OuterVolumeSpecName: "kube-api-access-4d9lj") pod "0c27755f-ec0b-4b73-a18b-f09ed1715ef7" (UID: "0c27755f-ec0b-4b73-a18b-f09ed1715ef7"). InnerVolumeSpecName "kube-api-access-4d9lj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:32:06 crc kubenswrapper[4898]: I1211 13:32:06.930611 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c27755f-ec0b-4b73-a18b-f09ed1715ef7" (UID: "0c27755f-ec0b-4b73-a18b-f09ed1715ef7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:06 crc kubenswrapper[4898]: I1211 13:32:06.934061 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-config-data" (OuterVolumeSpecName: "config-data") pod "0c27755f-ec0b-4b73-a18b-f09ed1715ef7" (UID: "0c27755f-ec0b-4b73-a18b-f09ed1715ef7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:07 crc kubenswrapper[4898]: I1211 13:32:07.000648 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:07 crc kubenswrapper[4898]: I1211 13:32:07.000695 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:07 crc kubenswrapper[4898]: I1211 13:32:07.000712 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:07 crc kubenswrapper[4898]: I1211 13:32:07.000725 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d9lj\" (UniqueName: \"kubernetes.io/projected/0c27755f-ec0b-4b73-a18b-f09ed1715ef7-kube-api-access-4d9lj\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:07 crc kubenswrapper[4898]: I1211 13:32:07.300199 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8hx6m" event={"ID":"0c27755f-ec0b-4b73-a18b-f09ed1715ef7","Type":"ContainerDied","Data":"4e32515575a152bcff775c9b33abf432df9cafe731d73dcf30140c9b690cf97c"} Dec 11 13:32:07 crc kubenswrapper[4898]: I1211 13:32:07.300235 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e32515575a152bcff775c9b33abf432df9cafe731d73dcf30140c9b690cf97c" Dec 11 13:32:07 crc kubenswrapper[4898]: I1211 13:32:07.300270 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8hx6m" Dec 11 13:32:07 crc kubenswrapper[4898]: I1211 13:32:07.801891 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 11 13:32:07 crc kubenswrapper[4898]: I1211 13:32:07.802511 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-listener" containerID="cri-o://d43c6a7ba4d05c274438d925d713c555505735c68c1af45658a1cbb485f781cd" gracePeriod=30 Dec 11 13:32:07 crc kubenswrapper[4898]: I1211 13:32:07.802557 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-notifier" containerID="cri-o://6b5434e0063a62a0d125c37aa7e9aba4fae1680a17072d74cbf1cca9505d5d72" gracePeriod=30 Dec 11 13:32:07 crc kubenswrapper[4898]: I1211 13:32:07.802802 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-evaluator" containerID="cri-o://de4b16c5afcb5773b006b4743cd600fd4507bc0a5cf283b2ebd4df8bb6cfe58e" gracePeriod=30 Dec 11 13:32:07 crc kubenswrapper[4898]: I1211 13:32:07.803029 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-api" containerID="cri-o://5827c5a3c0bb30c0cc495c71bdc472acab633f448964aa6a6369156d76ec74ee" gracePeriod=30 Dec 11 13:32:08 crc kubenswrapper[4898]: I1211 13:32:08.313668 4898 generic.go:334] "Generic (PLEG): container finished" podID="987dc857-e48a-418b-b021-4c6048a9c47d" containerID="de4b16c5afcb5773b006b4743cd600fd4507bc0a5cf283b2ebd4df8bb6cfe58e" exitCode=0 Dec 11 13:32:08 crc kubenswrapper[4898]: I1211 13:32:08.313706 4898 generic.go:334] "Generic (PLEG): container finished" podID="987dc857-e48a-418b-b021-4c6048a9c47d" 
containerID="5827c5a3c0bb30c0cc495c71bdc472acab633f448964aa6a6369156d76ec74ee" exitCode=0 Dec 11 13:32:08 crc kubenswrapper[4898]: I1211 13:32:08.313729 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"987dc857-e48a-418b-b021-4c6048a9c47d","Type":"ContainerDied","Data":"de4b16c5afcb5773b006b4743cd600fd4507bc0a5cf283b2ebd4df8bb6cfe58e"} Dec 11 13:32:08 crc kubenswrapper[4898]: I1211 13:32:08.313757 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"987dc857-e48a-418b-b021-4c6048a9c47d","Type":"ContainerDied","Data":"5827c5a3c0bb30c0cc495c71bdc472acab633f448964aa6a6369156d76ec74ee"} Dec 11 13:32:10 crc kubenswrapper[4898]: I1211 13:32:10.215342 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:32:11 crc kubenswrapper[4898]: I1211 13:32:11.349487 4898 generic.go:334] "Generic (PLEG): container finished" podID="987dc857-e48a-418b-b021-4c6048a9c47d" containerID="6b5434e0063a62a0d125c37aa7e9aba4fae1680a17072d74cbf1cca9505d5d72" exitCode=0 Dec 11 13:32:11 crc kubenswrapper[4898]: I1211 13:32:11.349990 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"987dc857-e48a-418b-b021-4c6048a9c47d","Type":"ContainerDied","Data":"6b5434e0063a62a0d125c37aa7e9aba4fae1680a17072d74cbf1cca9505d5d72"} Dec 11 13:32:11 crc kubenswrapper[4898]: I1211 13:32:11.351743 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" event={"ID":"8831e333-4a32-4f65-85e7-16a9a95360c9","Type":"ContainerStarted","Data":"9613319d4c278c8a19994f6480822ba5fd2894a6d6bbc5c325c7aa13cfc2affc"} Dec 11 13:32:11 crc kubenswrapper[4898]: I1211 13:32:11.376385 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" podStartSLOduration=3.015848475 
podStartE2EDuration="33.376360402s" podCreationTimestamp="2025-12-11 13:31:38 +0000 UTC" firstStartedPulling="2025-12-11 13:31:39.851403409 +0000 UTC m=+1657.423729846" lastFinishedPulling="2025-12-11 13:32:10.211915296 +0000 UTC m=+1687.784241773" observedRunningTime="2025-12-11 13:32:11.371005358 +0000 UTC m=+1688.943331795" watchObservedRunningTime="2025-12-11 13:32:11.376360402 +0000 UTC m=+1688.948686849" Dec 11 13:32:12 crc kubenswrapper[4898]: I1211 13:32:12.366832 4898 generic.go:334] "Generic (PLEG): container finished" podID="987dc857-e48a-418b-b021-4c6048a9c47d" containerID="d43c6a7ba4d05c274438d925d713c555505735c68c1af45658a1cbb485f781cd" exitCode=0 Dec 11 13:32:12 crc kubenswrapper[4898]: I1211 13:32:12.366892 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"987dc857-e48a-418b-b021-4c6048a9c47d","Type":"ContainerDied","Data":"d43c6a7ba4d05c274438d925d713c555505735c68c1af45658a1cbb485f781cd"} Dec 11 13:32:12 crc kubenswrapper[4898]: I1211 13:32:12.747540 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 11 13:32:12 crc kubenswrapper[4898]: I1211 13:32:12.953703 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-scripts\") pod \"987dc857-e48a-418b-b021-4c6048a9c47d\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " Dec 11 13:32:12 crc kubenswrapper[4898]: I1211 13:32:12.953812 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-config-data\") pod \"987dc857-e48a-418b-b021-4c6048a9c47d\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " Dec 11 13:32:12 crc kubenswrapper[4898]: I1211 13:32:12.953856 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-internal-tls-certs\") pod \"987dc857-e48a-418b-b021-4c6048a9c47d\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " Dec 11 13:32:12 crc kubenswrapper[4898]: I1211 13:32:12.953925 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6gf7\" (UniqueName: \"kubernetes.io/projected/987dc857-e48a-418b-b021-4c6048a9c47d-kube-api-access-c6gf7\") pod \"987dc857-e48a-418b-b021-4c6048a9c47d\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " Dec 11 13:32:12 crc kubenswrapper[4898]: I1211 13:32:12.954079 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-combined-ca-bundle\") pod \"987dc857-e48a-418b-b021-4c6048a9c47d\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " Dec 11 13:32:12 crc kubenswrapper[4898]: I1211 13:32:12.954131 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-public-tls-certs\") pod \"987dc857-e48a-418b-b021-4c6048a9c47d\" (UID: \"987dc857-e48a-418b-b021-4c6048a9c47d\") " Dec 11 13:32:12 crc kubenswrapper[4898]: I1211 13:32:12.974681 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-scripts" (OuterVolumeSpecName: "scripts") pod "987dc857-e48a-418b-b021-4c6048a9c47d" (UID: "987dc857-e48a-418b-b021-4c6048a9c47d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:12 crc kubenswrapper[4898]: I1211 13:32:12.977110 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/987dc857-e48a-418b-b021-4c6048a9c47d-kube-api-access-c6gf7" (OuterVolumeSpecName: "kube-api-access-c6gf7") pod "987dc857-e48a-418b-b021-4c6048a9c47d" (UID: "987dc857-e48a-418b-b021-4c6048a9c47d"). InnerVolumeSpecName "kube-api-access-c6gf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.059255 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6gf7\" (UniqueName: \"kubernetes.io/projected/987dc857-e48a-418b-b021-4c6048a9c47d-kube-api-access-c6gf7\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.059570 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.144607 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "987dc857-e48a-418b-b021-4c6048a9c47d" (UID: "987dc857-e48a-418b-b021-4c6048a9c47d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.161735 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.187645 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "987dc857-e48a-418b-b021-4c6048a9c47d" (UID: "987dc857-e48a-418b-b021-4c6048a9c47d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.230586 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-config-data" (OuterVolumeSpecName: "config-data") pod "987dc857-e48a-418b-b021-4c6048a9c47d" (UID: "987dc857-e48a-418b-b021-4c6048a9c47d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.243798 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "987dc857-e48a-418b-b021-4c6048a9c47d" (UID: "987dc857-e48a-418b-b021-4c6048a9c47d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.264282 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.264324 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.264334 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987dc857-e48a-418b-b021-4c6048a9c47d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.381880 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"987dc857-e48a-418b-b021-4c6048a9c47d","Type":"ContainerDied","Data":"3d95242720a9b4647f124d419dcd5baf02ce3ed1aed5bf25907b93d3e32a2a8c"} Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.381943 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.381947 4898 scope.go:117] "RemoveContainer" containerID="d43c6a7ba4d05c274438d925d713c555505735c68c1af45658a1cbb485f781cd" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.429643 4898 scope.go:117] "RemoveContainer" containerID="6b5434e0063a62a0d125c37aa7e9aba4fae1680a17072d74cbf1cca9505d5d72" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.450515 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.467961 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.492994 4898 scope.go:117] "RemoveContainer" containerID="de4b16c5afcb5773b006b4743cd600fd4507bc0a5cf283b2ebd4df8bb6cfe58e" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.523582 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 11 13:32:13 crc kubenswrapper[4898]: E1211 13:32:13.524563 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-evaluator" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.524584 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-evaluator" Dec 11 13:32:13 crc kubenswrapper[4898]: E1211 13:32:13.524625 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-notifier" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.524634 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-notifier" Dec 11 13:32:13 crc kubenswrapper[4898]: E1211 13:32:13.524666 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-api" Dec 
11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.524673 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-api" Dec 11 13:32:13 crc kubenswrapper[4898]: E1211 13:32:13.524707 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-listener" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.524713 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-listener" Dec 11 13:32:13 crc kubenswrapper[4898]: E1211 13:32:13.524732 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e113da52-055b-47d0-b16a-2fef9d302bbe" containerName="heat-engine" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.524737 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e113da52-055b-47d0-b16a-2fef9d302bbe" containerName="heat-engine" Dec 11 13:32:13 crc kubenswrapper[4898]: E1211 13:32:13.524757 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c27755f-ec0b-4b73-a18b-f09ed1715ef7" containerName="aodh-db-sync" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.524764 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c27755f-ec0b-4b73-a18b-f09ed1715ef7" containerName="aodh-db-sync" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.525264 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c27755f-ec0b-4b73-a18b-f09ed1715ef7" containerName="aodh-db-sync" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.525292 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-listener" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.525319 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e113da52-055b-47d0-b16a-2fef9d302bbe" containerName="heat-engine" Dec 11 13:32:13 crc kubenswrapper[4898]: 
I1211 13:32:13.525348 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-evaluator" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.525374 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-api" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.525387 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" containerName="aodh-notifier" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.531826 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.540078 4898 scope.go:117] "RemoveContainer" containerID="5827c5a3c0bb30c0cc495c71bdc472acab633f448964aa6a6369156d76ec74ee" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.545947 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.546075 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-h8xn6" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.546105 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.546227 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.546368 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.557588 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.674287 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-public-tls-certs\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.674381 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mltf9\" (UniqueName: \"kubernetes.io/projected/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-kube-api-access-mltf9\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.674449 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-combined-ca-bundle\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.674596 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-internal-tls-certs\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.674623 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-scripts\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.674737 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-config-data\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.776402 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-internal-tls-certs\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.776468 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-scripts\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.776574 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-config-data\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.776597 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-public-tls-certs\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.776662 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mltf9\" (UniqueName: \"kubernetes.io/projected/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-kube-api-access-mltf9\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.776734 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-combined-ca-bundle\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.781095 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-scripts\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.781319 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-internal-tls-certs\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.782428 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-public-tls-certs\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.784900 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-config-data\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.785355 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-combined-ca-bundle\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.795340 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mltf9\" (UniqueName: \"kubernetes.io/projected/bd49b53a-7cbb-4ccf-953a-7ed292c090bf-kube-api-access-mltf9\") pod \"aodh-0\" (UID: \"bd49b53a-7cbb-4ccf-953a-7ed292c090bf\") " pod="openstack/aodh-0" Dec 11 13:32:13 crc kubenswrapper[4898]: I1211 13:32:13.860358 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 11 13:32:14 crc kubenswrapper[4898]: I1211 13:32:14.377842 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 11 13:32:14 crc kubenswrapper[4898]: I1211 13:32:14.404101 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"bd49b53a-7cbb-4ccf-953a-7ed292c090bf","Type":"ContainerStarted","Data":"3d57730ac92204563500ec623d9968d77b3e43fb1eb1e25afea17e1df56fa70c"} Dec 11 13:32:14 crc kubenswrapper[4898]: I1211 13:32:14.788403 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="987dc857-e48a-418b-b021-4c6048a9c47d" path="/var/lib/kubelet/pods/987dc857-e48a-418b-b021-4c6048a9c47d/volumes" Dec 11 13:32:15 crc kubenswrapper[4898]: I1211 13:32:15.418306 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"bd49b53a-7cbb-4ccf-953a-7ed292c090bf","Type":"ContainerStarted","Data":"04cddd11292dc845f5301eb1e75a38c73125fc7f625b28c919f6288c0403e345"} Dec 11 13:32:17 crc kubenswrapper[4898]: I1211 13:32:17.442191 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"bd49b53a-7cbb-4ccf-953a-7ed292c090bf","Type":"ContainerStarted","Data":"7a5737ea331236627bf93afc96052c11b63f528e961100ed6c5f4e81d723a7cd"} Dec 11 13:32:17 crc kubenswrapper[4898]: I1211 13:32:17.775373 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:32:17 crc kubenswrapper[4898]: E1211 13:32:17.775691 4898 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:32:19 crc kubenswrapper[4898]: I1211 13:32:19.478910 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"bd49b53a-7cbb-4ccf-953a-7ed292c090bf","Type":"ContainerStarted","Data":"4724fd21d54d4df5080f480d42322afe4d07b1a33b603906d602c067e29838a5"} Dec 11 13:32:20 crc kubenswrapper[4898]: I1211 13:32:20.496862 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"bd49b53a-7cbb-4ccf-953a-7ed292c090bf","Type":"ContainerStarted","Data":"ff142c405fcbc9c5c249c22bef44bc1619b5a7874e583421579c8fb0ca1b565d"} Dec 11 13:32:20 crc kubenswrapper[4898]: I1211 13:32:20.521295 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.353492859 podStartE2EDuration="7.521274784s" podCreationTimestamp="2025-12-11 13:32:13 +0000 UTC" firstStartedPulling="2025-12-11 13:32:14.384082194 +0000 UTC m=+1691.956408631" lastFinishedPulling="2025-12-11 13:32:19.551864119 +0000 UTC m=+1697.124190556" observedRunningTime="2025-12-11 13:32:20.515781906 +0000 UTC m=+1698.088108343" watchObservedRunningTime="2025-12-11 13:32:20.521274784 +0000 UTC m=+1698.093601221" Dec 11 13:32:24 crc kubenswrapper[4898]: I1211 13:32:24.552748 4898 generic.go:334] "Generic (PLEG): container finished" podID="8831e333-4a32-4f65-85e7-16a9a95360c9" containerID="9613319d4c278c8a19994f6480822ba5fd2894a6d6bbc5c325c7aa13cfc2affc" exitCode=0 Dec 11 13:32:24 crc kubenswrapper[4898]: I1211 13:32:24.552845 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" event={"ID":"8831e333-4a32-4f65-85e7-16a9a95360c9","Type":"ContainerDied","Data":"9613319d4c278c8a19994f6480822ba5fd2894a6d6bbc5c325c7aa13cfc2affc"} Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.077326 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.212792 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-ssh-key\") pod \"8831e333-4a32-4f65-85e7-16a9a95360c9\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.212891 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-inventory\") pod \"8831e333-4a32-4f65-85e7-16a9a95360c9\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.213011 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dh4s\" (UniqueName: \"kubernetes.io/projected/8831e333-4a32-4f65-85e7-16a9a95360c9-kube-api-access-9dh4s\") pod \"8831e333-4a32-4f65-85e7-16a9a95360c9\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.213043 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-repo-setup-combined-ca-bundle\") pod \"8831e333-4a32-4f65-85e7-16a9a95360c9\" (UID: \"8831e333-4a32-4f65-85e7-16a9a95360c9\") " Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.219172 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8831e333-4a32-4f65-85e7-16a9a95360c9-kube-api-access-9dh4s" (OuterVolumeSpecName: "kube-api-access-9dh4s") pod "8831e333-4a32-4f65-85e7-16a9a95360c9" (UID: "8831e333-4a32-4f65-85e7-16a9a95360c9"). InnerVolumeSpecName "kube-api-access-9dh4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.219733 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8831e333-4a32-4f65-85e7-16a9a95360c9" (UID: "8831e333-4a32-4f65-85e7-16a9a95360c9"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.247781 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-inventory" (OuterVolumeSpecName: "inventory") pod "8831e333-4a32-4f65-85e7-16a9a95360c9" (UID: "8831e333-4a32-4f65-85e7-16a9a95360c9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.259434 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8831e333-4a32-4f65-85e7-16a9a95360c9" (UID: "8831e333-4a32-4f65-85e7-16a9a95360c9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.315560 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.315601 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.315613 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dh4s\" (UniqueName: \"kubernetes.io/projected/8831e333-4a32-4f65-85e7-16a9a95360c9-kube-api-access-9dh4s\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.315625 4898 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8831e333-4a32-4f65-85e7-16a9a95360c9-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.584144 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" event={"ID":"8831e333-4a32-4f65-85e7-16a9a95360c9","Type":"ContainerDied","Data":"0258d62e3dc79794e62212ed082321d988df13b9473d63db77fd236b4f9f9f60"} Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.584529 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0258d62e3dc79794e62212ed082321d988df13b9473d63db77fd236b4f9f9f60" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.584207 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.739083 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85"] Dec 11 13:32:26 crc kubenswrapper[4898]: E1211 13:32:26.739665 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8831e333-4a32-4f65-85e7-16a9a95360c9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.739686 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8831e333-4a32-4f65-85e7-16a9a95360c9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.740043 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8831e333-4a32-4f65-85e7-16a9a95360c9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.740912 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.743862 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.744110 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.744296 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.744486 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.749929 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85"] Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.827653 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xljrw\" (UniqueName: \"kubernetes.io/projected/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-kube-api-access-xljrw\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89l85\" (UID: \"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.827714 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89l85\" (UID: \"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.827739 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89l85\" (UID: \"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.931296 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xljrw\" (UniqueName: \"kubernetes.io/projected/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-kube-api-access-xljrw\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89l85\" (UID: \"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.931364 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89l85\" (UID: \"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.931396 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89l85\" (UID: \"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.938326 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89l85\" (UID: \"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.951149 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89l85\" (UID: \"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" Dec 11 13:32:26 crc kubenswrapper[4898]: I1211 13:32:26.975643 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xljrw\" (UniqueName: \"kubernetes.io/projected/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-kube-api-access-xljrw\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89l85\" (UID: \"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" Dec 11 13:32:27 crc kubenswrapper[4898]: I1211 13:32:27.059179 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" Dec 11 13:32:27 crc kubenswrapper[4898]: W1211 13:32:27.605193 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaec2abe1_e7dc_48b7_b34d_f5b50f289ab8.slice/crio-dda256fdc462173b653f7072b0a9eabb2cf3247aa058e3bc9212f651cbbae930 WatchSource:0}: Error finding container dda256fdc462173b653f7072b0a9eabb2cf3247aa058e3bc9212f651cbbae930: Status 404 returned error can't find the container with id dda256fdc462173b653f7072b0a9eabb2cf3247aa058e3bc9212f651cbbae930 Dec 11 13:32:27 crc kubenswrapper[4898]: I1211 13:32:27.608254 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85"] Dec 11 13:32:28 crc kubenswrapper[4898]: I1211 13:32:28.612130 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" event={"ID":"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8","Type":"ContainerStarted","Data":"dda256fdc462173b653f7072b0a9eabb2cf3247aa058e3bc9212f651cbbae930"} Dec 11 13:32:29 crc kubenswrapper[4898]: I1211 13:32:29.623727 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" event={"ID":"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8","Type":"ContainerStarted","Data":"fc40cd03e7c86445990c0812a8aada078c540391f2d6c693ed0eb1e39f46c8a2"} Dec 11 13:32:29 crc kubenswrapper[4898]: I1211 13:32:29.648551 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" podStartSLOduration=2.20954236 podStartE2EDuration="3.648534141s" podCreationTimestamp="2025-12-11 13:32:26 +0000 UTC" firstStartedPulling="2025-12-11 13:32:27.616983295 +0000 UTC m=+1705.189309732" lastFinishedPulling="2025-12-11 13:32:29.055975076 +0000 UTC m=+1706.628301513" observedRunningTime="2025-12-11 
13:32:29.644193985 +0000 UTC m=+1707.216520432" watchObservedRunningTime="2025-12-11 13:32:29.648534141 +0000 UTC m=+1707.220860578" Dec 11 13:32:29 crc kubenswrapper[4898]: I1211 13:32:29.775514 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:32:29 crc kubenswrapper[4898]: E1211 13:32:29.775847 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:32:32 crc kubenswrapper[4898]: I1211 13:32:32.672211 4898 generic.go:334] "Generic (PLEG): container finished" podID="aec2abe1-e7dc-48b7-b34d-f5b50f289ab8" containerID="fc40cd03e7c86445990c0812a8aada078c540391f2d6c693ed0eb1e39f46c8a2" exitCode=0 Dec 11 13:32:32 crc kubenswrapper[4898]: I1211 13:32:32.672295 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" event={"ID":"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8","Type":"ContainerDied","Data":"fc40cd03e7c86445990c0812a8aada078c540391f2d6c693ed0eb1e39f46c8a2"} Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.206577 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.310233 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-inventory\") pod \"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8\" (UID: \"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8\") " Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.310350 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xljrw\" (UniqueName: \"kubernetes.io/projected/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-kube-api-access-xljrw\") pod \"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8\" (UID: \"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8\") " Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.310432 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-ssh-key\") pod \"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8\" (UID: \"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8\") " Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.315985 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-kube-api-access-xljrw" (OuterVolumeSpecName: "kube-api-access-xljrw") pod "aec2abe1-e7dc-48b7-b34d-f5b50f289ab8" (UID: "aec2abe1-e7dc-48b7-b34d-f5b50f289ab8"). InnerVolumeSpecName "kube-api-access-xljrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.348646 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aec2abe1-e7dc-48b7-b34d-f5b50f289ab8" (UID: "aec2abe1-e7dc-48b7-b34d-f5b50f289ab8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.349178 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-inventory" (OuterVolumeSpecName: "inventory") pod "aec2abe1-e7dc-48b7-b34d-f5b50f289ab8" (UID: "aec2abe1-e7dc-48b7-b34d-f5b50f289ab8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.413332 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.413361 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xljrw\" (UniqueName: \"kubernetes.io/projected/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-kube-api-access-xljrw\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.413372 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aec2abe1-e7dc-48b7-b34d-f5b50f289ab8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.698986 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" event={"ID":"aec2abe1-e7dc-48b7-b34d-f5b50f289ab8","Type":"ContainerDied","Data":"dda256fdc462173b653f7072b0a9eabb2cf3247aa058e3bc9212f651cbbae930"} Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.699393 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dda256fdc462173b653f7072b0a9eabb2cf3247aa058e3bc9212f651cbbae930" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.699151 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89l85" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.798675 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf"] Dec 11 13:32:34 crc kubenswrapper[4898]: E1211 13:32:34.799363 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aec2abe1-e7dc-48b7-b34d-f5b50f289ab8" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.799388 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec2abe1-e7dc-48b7-b34d-f5b50f289ab8" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.799749 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="aec2abe1-e7dc-48b7-b34d-f5b50f289ab8" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.800842 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.803978 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.804085 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.804206 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.805267 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.826926 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf"] Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.924108 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5xvg\" (UniqueName: \"kubernetes.io/projected/329b098d-eb3e-413e-b398-697e70fb3ef9-kube-api-access-v5xvg\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.924399 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.924951 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:32:34 crc kubenswrapper[4898]: I1211 13:32:34.924995 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:32:35 crc kubenswrapper[4898]: I1211 13:32:35.027277 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:32:35 crc kubenswrapper[4898]: I1211 13:32:35.027367 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:32:35 crc kubenswrapper[4898]: I1211 13:32:35.027496 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5xvg\" (UniqueName: \"kubernetes.io/projected/329b098d-eb3e-413e-b398-697e70fb3ef9-kube-api-access-v5xvg\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:32:35 crc kubenswrapper[4898]: I1211 13:32:35.027612 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:32:35 crc kubenswrapper[4898]: I1211 13:32:35.033136 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:32:35 crc kubenswrapper[4898]: I1211 13:32:35.033511 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:32:35 crc kubenswrapper[4898]: I1211 13:32:35.035038 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:32:35 crc kubenswrapper[4898]: I1211 13:32:35.048242 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-v5xvg\" (UniqueName: \"kubernetes.io/projected/329b098d-eb3e-413e-b398-697e70fb3ef9-kube-api-access-v5xvg\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:32:35 crc kubenswrapper[4898]: I1211 13:32:35.120898 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:32:35 crc kubenswrapper[4898]: I1211 13:32:35.710685 4898 scope.go:117] "RemoveContainer" containerID="5c45ba0adcd382f966e106c460bc5c3e858dd387f9988d492a1a441951833b10" Dec 11 13:32:35 crc kubenswrapper[4898]: I1211 13:32:35.724792 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf"] Dec 11 13:32:35 crc kubenswrapper[4898]: W1211 13:32:35.739950 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod329b098d_eb3e_413e_b398_697e70fb3ef9.slice/crio-64a065a2c0746a92d6d30874eb60046840911d5669606a6838ad24cfb71a25bb WatchSource:0}: Error finding container 64a065a2c0746a92d6d30874eb60046840911d5669606a6838ad24cfb71a25bb: Status 404 returned error can't find the container with id 64a065a2c0746a92d6d30874eb60046840911d5669606a6838ad24cfb71a25bb Dec 11 13:32:35 crc kubenswrapper[4898]: I1211 13:32:35.791231 4898 scope.go:117] "RemoveContainer" containerID="a8fb01bc589dee8d9d5d9e46068c3f94228cd2cd6920721105182af1d784b83e" Dec 11 13:32:36 crc kubenswrapper[4898]: I1211 13:32:36.722833 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" event={"ID":"329b098d-eb3e-413e-b398-697e70fb3ef9","Type":"ContainerStarted","Data":"64a065a2c0746a92d6d30874eb60046840911d5669606a6838ad24cfb71a25bb"} Dec 11 13:32:38 crc kubenswrapper[4898]: I1211 
13:32:38.757609 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" event={"ID":"329b098d-eb3e-413e-b398-697e70fb3ef9","Type":"ContainerStarted","Data":"499d397bc6d836e7904b74989063eacdf93d10da47a1f78e3d4a1cacf9c769b8"} Dec 11 13:32:38 crc kubenswrapper[4898]: I1211 13:32:38.791697 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" podStartSLOduration=2.964767808 podStartE2EDuration="4.791674545s" podCreationTimestamp="2025-12-11 13:32:34 +0000 UTC" firstStartedPulling="2025-12-11 13:32:35.790632362 +0000 UTC m=+1713.362958799" lastFinishedPulling="2025-12-11 13:32:37.617539109 +0000 UTC m=+1715.189865536" observedRunningTime="2025-12-11 13:32:38.780724642 +0000 UTC m=+1716.353051079" watchObservedRunningTime="2025-12-11 13:32:38.791674545 +0000 UTC m=+1716.364000992" Dec 11 13:32:44 crc kubenswrapper[4898]: I1211 13:32:44.775695 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:32:44 crc kubenswrapper[4898]: E1211 13:32:44.776377 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:32:59 crc kubenswrapper[4898]: I1211 13:32:59.775363 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:32:59 crc kubenswrapper[4898]: E1211 13:32:59.777239 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:33:14 crc kubenswrapper[4898]: I1211 13:33:14.775528 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:33:14 crc kubenswrapper[4898]: E1211 13:33:14.776388 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:33:25 crc kubenswrapper[4898]: I1211 13:33:25.776027 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:33:25 crc kubenswrapper[4898]: E1211 13:33:25.776945 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:33:40 crc kubenswrapper[4898]: I1211 13:33:40.774829 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:33:40 crc kubenswrapper[4898]: E1211 13:33:40.775803 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:33:51 crc kubenswrapper[4898]: I1211 13:33:51.775711 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:33:51 crc kubenswrapper[4898]: E1211 13:33:51.776773 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:34:05 crc kubenswrapper[4898]: I1211 13:34:05.775497 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:34:05 crc kubenswrapper[4898]: E1211 13:34:05.776294 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:34:19 crc kubenswrapper[4898]: I1211 13:34:19.776985 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:34:19 crc kubenswrapper[4898]: E1211 13:34:19.778087 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:34:34 crc kubenswrapper[4898]: I1211 13:34:34.777036 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:34:34 crc kubenswrapper[4898]: E1211 13:34:34.778612 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:34:36 crc kubenswrapper[4898]: I1211 13:34:36.224217 4898 scope.go:117] "RemoveContainer" containerID="a3c664e55bf0186f9da9cc6fb55b1679a3f6b85c6614e1aacc0f6f280ef624d8" Dec 11 13:34:36 crc kubenswrapper[4898]: I1211 13:34:36.258840 4898 scope.go:117] "RemoveContainer" containerID="514a94234818ba822cb459a5fdaa1a0acdd09e6aa1865cffe12dc7c985d3bc5c" Dec 11 13:34:48 crc kubenswrapper[4898]: I1211 13:34:48.775402 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:34:48 crc kubenswrapper[4898]: E1211 13:34:48.776551 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" 
podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:35:03 crc kubenswrapper[4898]: I1211 13:35:03.775576 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:35:03 crc kubenswrapper[4898]: E1211 13:35:03.776339 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:35:08 crc kubenswrapper[4898]: I1211 13:35:08.065563 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-pddzq"] Dec 11 13:35:08 crc kubenswrapper[4898]: I1211 13:35:08.084955 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-267e-account-create-update-rkrvq"] Dec 11 13:35:08 crc kubenswrapper[4898]: I1211 13:35:08.104294 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-mj9zd"] Dec 11 13:35:08 crc kubenswrapper[4898]: I1211 13:35:08.115710 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-pddzq"] Dec 11 13:35:08 crc kubenswrapper[4898]: I1211 13:35:08.127677 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-95d5-account-create-update-62mxt"] Dec 11 13:35:08 crc kubenswrapper[4898]: I1211 13:35:08.162297 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-mj9zd"] Dec 11 13:35:08 crc kubenswrapper[4898]: I1211 13:35:08.172448 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-267e-account-create-update-rkrvq"] Dec 11 13:35:08 crc kubenswrapper[4898]: I1211 13:35:08.189158 4898 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/placement-95d5-account-create-update-62mxt"] Dec 11 13:35:08 crc kubenswrapper[4898]: I1211 13:35:08.797218 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04e19d4f-0d50-4924-bd7d-812d753d76ac" path="/var/lib/kubelet/pods/04e19d4f-0d50-4924-bd7d-812d753d76ac/volumes" Dec 11 13:35:08 crc kubenswrapper[4898]: I1211 13:35:08.798427 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="376bd2c9-d19e-4322-8269-847515a788cb" path="/var/lib/kubelet/pods/376bd2c9-d19e-4322-8269-847515a788cb/volumes" Dec 11 13:35:08 crc kubenswrapper[4898]: I1211 13:35:08.799497 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="399ee20b-e403-4d88-bd87-77a1c8b71e93" path="/var/lib/kubelet/pods/399ee20b-e403-4d88-bd87-77a1c8b71e93/volumes" Dec 11 13:35:08 crc kubenswrapper[4898]: I1211 13:35:08.800824 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d589c38c-afcf-4107-bf16-ac57d302576e" path="/var/lib/kubelet/pods/d589c38c-afcf-4107-bf16-ac57d302576e/volumes" Dec 11 13:35:11 crc kubenswrapper[4898]: I1211 13:35:11.030558 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-6502-account-create-update-6vp66"] Dec 11 13:35:11 crc kubenswrapper[4898]: I1211 13:35:11.042709 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6502-account-create-update-6vp66"] Dec 11 13:35:12 crc kubenswrapper[4898]: I1211 13:35:12.029597 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-x6tgv"] Dec 11 13:35:12 crc kubenswrapper[4898]: I1211 13:35:12.044906 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-2wggg"] Dec 11 13:35:12 crc kubenswrapper[4898]: I1211 13:35:12.054207 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-x6tgv"] Dec 11 13:35:12 crc kubenswrapper[4898]: I1211 13:35:12.064997 4898 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-2wggg"] Dec 11 13:35:12 crc kubenswrapper[4898]: I1211 13:35:12.075357 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-2649-account-create-update-xf8ww"] Dec 11 13:35:12 crc kubenswrapper[4898]: I1211 13:35:12.087829 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-2649-account-create-update-xf8ww"] Dec 11 13:35:12 crc kubenswrapper[4898]: I1211 13:35:12.793942 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d685a72-643d-4739-9e26-28a37c6391d3" path="/var/lib/kubelet/pods/8d685a72-643d-4739-9e26-28a37c6391d3/volumes" Dec 11 13:35:12 crc kubenswrapper[4898]: I1211 13:35:12.795563 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d28503d-c258-4bea-870f-7d8b34591c6e" path="/var/lib/kubelet/pods/9d28503d-c258-4bea-870f-7d8b34591c6e/volumes" Dec 11 13:35:12 crc kubenswrapper[4898]: I1211 13:35:12.796916 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b41e46-2b51-441b-b181-ab36339a8d19" path="/var/lib/kubelet/pods/b7b41e46-2b51-441b-b181-ab36339a8d19/volumes" Dec 11 13:35:12 crc kubenswrapper[4898]: I1211 13:35:12.797926 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b83689fb-d5d6-4c88-8976-791b41ff048f" path="/var/lib/kubelet/pods/b83689fb-d5d6-4c88-8976-791b41ff048f/volumes" Dec 11 13:35:14 crc kubenswrapper[4898]: I1211 13:35:14.775435 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:35:14 crc kubenswrapper[4898]: E1211 13:35:14.776294 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:35:18 crc kubenswrapper[4898]: I1211 13:35:18.046402 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq"] Dec 11 13:35:18 crc kubenswrapper[4898]: I1211 13:35:18.060966 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-bhvfq"] Dec 11 13:35:18 crc kubenswrapper[4898]: I1211 13:35:18.073520 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-ca33-account-create-update-8x4rf"] Dec 11 13:35:18 crc kubenswrapper[4898]: I1211 13:35:18.085784 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-ca33-account-create-update-8x4rf"] Dec 11 13:35:18 crc kubenswrapper[4898]: I1211 13:35:18.791499 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cc39efd-7ed2-48a9-a27e-e637091064ef" path="/var/lib/kubelet/pods/8cc39efd-7ed2-48a9-a27e-e637091064ef/volumes" Dec 11 13:35:18 crc kubenswrapper[4898]: I1211 13:35:18.793576 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95b56d2f-c89b-4531-9e3a-f2c7accab5dd" path="/var/lib/kubelet/pods/95b56d2f-c89b-4531-9e3a-f2c7accab5dd/volumes" Dec 11 13:35:26 crc kubenswrapper[4898]: I1211 13:35:26.775825 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:35:26 crc kubenswrapper[4898]: E1211 13:35:26.776596 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:35:36 crc kubenswrapper[4898]: I1211 13:35:36.335917 4898 scope.go:117] "RemoveContainer" containerID="e01a2f3c04c58809c753d3d56c8b5179d770c2fb21f9f086d7a992fd0e41e50d" Dec 11 13:35:36 crc kubenswrapper[4898]: I1211 13:35:36.384900 4898 scope.go:117] "RemoveContainer" containerID="e91e97d01c180d6d046e785fd4f812474b772dffb8fc3766a01e98017b145bee" Dec 11 13:35:36 crc kubenswrapper[4898]: I1211 13:35:36.452050 4898 scope.go:117] "RemoveContainer" containerID="fa755ac0fd9e6a647310ce8425f2d331da86e2399db53aa213fbc7e0c23ae838" Dec 11 13:35:36 crc kubenswrapper[4898]: I1211 13:35:36.516324 4898 scope.go:117] "RemoveContainer" containerID="ee6a9e1bfc5f1196653d49d8f163454806653e71d73badb97e660496a768d9c3" Dec 11 13:35:36 crc kubenswrapper[4898]: I1211 13:35:36.569191 4898 scope.go:117] "RemoveContainer" containerID="612c2c79ebc3623dba06ee7e330fab02c5aaf37676368645a297d4ecef5f9190" Dec 11 13:35:36 crc kubenswrapper[4898]: I1211 13:35:36.620426 4898 scope.go:117] "RemoveContainer" containerID="d628a291c281d94a14bfc2008ec184b1b172c8e4bf9783e218fe8826f72dca34" Dec 11 13:35:36 crc kubenswrapper[4898]: I1211 13:35:36.681767 4898 scope.go:117] "RemoveContainer" containerID="1a799b1cdd93058623481fb9cd2fa7b7de26c460b06cc8ba6b0a53d030d6a5fe" Dec 11 13:35:36 crc kubenswrapper[4898]: I1211 13:35:36.703972 4898 scope.go:117] "RemoveContainer" containerID="bb2a3d1269e0f3aba572d22ac5a2c83c4753175f6dccd0278ce3743a1138239a" Dec 11 13:35:36 crc kubenswrapper[4898]: I1211 13:35:36.727550 4898 scope.go:117] "RemoveContainer" containerID="369e4b9ed923862da0bc88beeeeff2e5ceda527cab10a6f59c016a734e9429f1" Dec 11 13:35:36 crc kubenswrapper[4898]: I1211 13:35:36.758085 4898 scope.go:117] "RemoveContainer" 
containerID="146f4995a97325967ce27374ab4cb69947c0be1a93c44dbee535156d898cc07b" Dec 11 13:35:36 crc kubenswrapper[4898]: I1211 13:35:36.788783 4898 scope.go:117] "RemoveContainer" containerID="c9a3f122a98d0205d6df97b02ea6119c381812cf008acf7587427145634ff0e4" Dec 11 13:35:41 crc kubenswrapper[4898]: I1211 13:35:41.775212 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:35:41 crc kubenswrapper[4898]: E1211 13:35:41.776179 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:35:43 crc kubenswrapper[4898]: I1211 13:35:43.062449 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-d4nc9"] Dec 11 13:35:43 crc kubenswrapper[4898]: I1211 13:35:43.084314 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-nwptx"] Dec 11 13:35:43 crc kubenswrapper[4898]: I1211 13:35:43.100651 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-d4nc9"] Dec 11 13:35:43 crc kubenswrapper[4898]: I1211 13:35:43.138702 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-llbsz"] Dec 11 13:35:43 crc kubenswrapper[4898]: I1211 13:35:43.154063 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-llbsz"] Dec 11 13:35:43 crc kubenswrapper[4898]: I1211 13:35:43.172531 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-nwptx"] Dec 11 13:35:44 crc kubenswrapper[4898]: I1211 13:35:44.047521 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/heat-a629-account-create-update-fsml9"] Dec 11 13:35:44 crc kubenswrapper[4898]: I1211 13:35:44.058419 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-af81-account-create-update-rqlbc"] Dec 11 13:35:44 crc kubenswrapper[4898]: I1211 13:35:44.069131 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-a629-account-create-update-fsml9"] Dec 11 13:35:44 crc kubenswrapper[4898]: I1211 13:35:44.080276 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c170-account-create-update-l27sp"] Dec 11 13:35:44 crc kubenswrapper[4898]: I1211 13:35:44.090833 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-c170-account-create-update-l27sp"] Dec 11 13:35:44 crc kubenswrapper[4898]: I1211 13:35:44.103639 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-af81-account-create-update-rqlbc"] Dec 11 13:35:44 crc kubenswrapper[4898]: I1211 13:35:44.802522 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11baa6bc-e306-47e7-80c0-75a2236f35d0" path="/var/lib/kubelet/pods/11baa6bc-e306-47e7-80c0-75a2236f35d0/volumes" Dec 11 13:35:44 crc kubenswrapper[4898]: I1211 13:35:44.804815 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="204bbeca-833c-4e42-a955-03fde2c57e84" path="/var/lib/kubelet/pods/204bbeca-833c-4e42-a955-03fde2c57e84/volumes" Dec 11 13:35:44 crc kubenswrapper[4898]: I1211 13:35:44.807355 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="301f2f68-2aae-4019-8b8e-a9473250de65" path="/var/lib/kubelet/pods/301f2f68-2aae-4019-8b8e-a9473250de65/volumes" Dec 11 13:35:44 crc kubenswrapper[4898]: I1211 13:35:44.809290 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="821d27f4-deb1-4474-bee8-76c9caf611b1" path="/var/lib/kubelet/pods/821d27f4-deb1-4474-bee8-76c9caf611b1/volumes" Dec 11 13:35:44 crc kubenswrapper[4898]: I1211 13:35:44.812322 
4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e" path="/var/lib/kubelet/pods/9a3ef5d3-9ee5-4b40-bae4-3b5ddcfd036e/volumes" Dec 11 13:35:44 crc kubenswrapper[4898]: I1211 13:35:44.813428 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaf27272-313d-40d9-b882-151aaaf3da23" path="/var/lib/kubelet/pods/eaf27272-313d-40d9-b882-151aaaf3da23/volumes" Dec 11 13:35:49 crc kubenswrapper[4898]: I1211 13:35:49.045763 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-689f7"] Dec 11 13:35:49 crc kubenswrapper[4898]: I1211 13:35:49.063536 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fed5-account-create-update-6hwgx"] Dec 11 13:35:49 crc kubenswrapper[4898]: I1211 13:35:49.078305 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-lkkj2"] Dec 11 13:35:49 crc kubenswrapper[4898]: I1211 13:35:49.096691 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-689f7"] Dec 11 13:35:49 crc kubenswrapper[4898]: I1211 13:35:49.117849 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-lkkj2"] Dec 11 13:35:49 crc kubenswrapper[4898]: I1211 13:35:49.132127 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fed5-account-create-update-6hwgx"] Dec 11 13:35:49 crc kubenswrapper[4898]: I1211 13:35:49.216503 4898 generic.go:334] "Generic (PLEG): container finished" podID="329b098d-eb3e-413e-b398-697e70fb3ef9" containerID="499d397bc6d836e7904b74989063eacdf93d10da47a1f78e3d4a1cacf9c769b8" exitCode=0 Dec 11 13:35:49 crc kubenswrapper[4898]: I1211 13:35:49.216563 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" 
event={"ID":"329b098d-eb3e-413e-b398-697e70fb3ef9","Type":"ContainerDied","Data":"499d397bc6d836e7904b74989063eacdf93d10da47a1f78e3d4a1cacf9c769b8"} Dec 11 13:35:50 crc kubenswrapper[4898]: I1211 13:35:50.789377 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b57dfab-5b26-4f64-a406-ea1701ef79d1" path="/var/lib/kubelet/pods/0b57dfab-5b26-4f64-a406-ea1701ef79d1/volumes" Dec 11 13:35:50 crc kubenswrapper[4898]: I1211 13:35:50.791188 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c" path="/var/lib/kubelet/pods/6ffd41d5-6bb8-4cbf-8054-93e5e6b2a96c/volumes" Dec 11 13:35:50 crc kubenswrapper[4898]: I1211 13:35:50.792095 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:35:50 crc kubenswrapper[4898]: I1211 13:35:50.792187 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca6259a8-1ee8-47a6-b102-e7e22b93c2c1" path="/var/lib/kubelet/pods/ca6259a8-1ee8-47a6-b102-e7e22b93c2c1/volumes" Dec 11 13:35:50 crc kubenswrapper[4898]: I1211 13:35:50.912371 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5xvg\" (UniqueName: \"kubernetes.io/projected/329b098d-eb3e-413e-b398-697e70fb3ef9-kube-api-access-v5xvg\") pod \"329b098d-eb3e-413e-b398-697e70fb3ef9\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " Dec 11 13:35:50 crc kubenswrapper[4898]: I1211 13:35:50.913196 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-bootstrap-combined-ca-bundle\") pod \"329b098d-eb3e-413e-b398-697e70fb3ef9\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " Dec 11 13:35:50 crc kubenswrapper[4898]: I1211 13:35:50.913638 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-ssh-key\") pod \"329b098d-eb3e-413e-b398-697e70fb3ef9\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " Dec 11 13:35:50 crc kubenswrapper[4898]: I1211 13:35:50.913732 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-inventory\") pod \"329b098d-eb3e-413e-b398-697e70fb3ef9\" (UID: \"329b098d-eb3e-413e-b398-697e70fb3ef9\") " Dec 11 13:35:50 crc kubenswrapper[4898]: I1211 13:35:50.922709 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "329b098d-eb3e-413e-b398-697e70fb3ef9" (UID: "329b098d-eb3e-413e-b398-697e70fb3ef9"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:35:50 crc kubenswrapper[4898]: I1211 13:35:50.922759 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329b098d-eb3e-413e-b398-697e70fb3ef9-kube-api-access-v5xvg" (OuterVolumeSpecName: "kube-api-access-v5xvg") pod "329b098d-eb3e-413e-b398-697e70fb3ef9" (UID: "329b098d-eb3e-413e-b398-697e70fb3ef9"). InnerVolumeSpecName "kube-api-access-v5xvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:35:50 crc kubenswrapper[4898]: I1211 13:35:50.949944 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "329b098d-eb3e-413e-b398-697e70fb3ef9" (UID: "329b098d-eb3e-413e-b398-697e70fb3ef9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:35:50 crc kubenswrapper[4898]: I1211 13:35:50.964090 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-inventory" (OuterVolumeSpecName: "inventory") pod "329b098d-eb3e-413e-b398-697e70fb3ef9" (UID: "329b098d-eb3e-413e-b398-697e70fb3ef9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.017696 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5xvg\" (UniqueName: \"kubernetes.io/projected/329b098d-eb3e-413e-b398-697e70fb3ef9-kube-api-access-v5xvg\") on node \"crc\" DevicePath \"\"" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.017738 4898 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.017750 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.017762 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/329b098d-eb3e-413e-b398-697e70fb3ef9-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.256167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" event={"ID":"329b098d-eb3e-413e-b398-697e70fb3ef9","Type":"ContainerDied","Data":"64a065a2c0746a92d6d30874eb60046840911d5669606a6838ad24cfb71a25bb"} Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.256228 4898 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.256238 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64a065a2c0746a92d6d30874eb60046840911d5669606a6838ad24cfb71a25bb" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.362575 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k"] Dec 11 13:35:51 crc kubenswrapper[4898]: E1211 13:35:51.363177 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329b098d-eb3e-413e-b398-697e70fb3ef9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.363195 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="329b098d-eb3e-413e-b398-697e70fb3ef9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.363411 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="329b098d-eb3e-413e-b398-697e70fb3ef9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.364275 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.368148 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.368425 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.372177 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.372185 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.379346 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k"] Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.430867 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45f6z\" (UniqueName: \"kubernetes.io/projected/fcd00abe-3b13-42b3-81bc-671095c37415-kube-api-access-45f6z\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dj55k\" (UID: \"fcd00abe-3b13-42b3-81bc-671095c37415\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.430987 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fcd00abe-3b13-42b3-81bc-671095c37415-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dj55k\" (UID: \"fcd00abe-3b13-42b3-81bc-671095c37415\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 
13:35:51.431136 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fcd00abe-3b13-42b3-81bc-671095c37415-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dj55k\" (UID: \"fcd00abe-3b13-42b3-81bc-671095c37415\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.532878 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fcd00abe-3b13-42b3-81bc-671095c37415-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dj55k\" (UID: \"fcd00abe-3b13-42b3-81bc-671095c37415\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.533516 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45f6z\" (UniqueName: \"kubernetes.io/projected/fcd00abe-3b13-42b3-81bc-671095c37415-kube-api-access-45f6z\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dj55k\" (UID: \"fcd00abe-3b13-42b3-81bc-671095c37415\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.533641 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fcd00abe-3b13-42b3-81bc-671095c37415-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dj55k\" (UID: \"fcd00abe-3b13-42b3-81bc-671095c37415\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.539091 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fcd00abe-3b13-42b3-81bc-671095c37415-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-dj55k\" (UID: \"fcd00abe-3b13-42b3-81bc-671095c37415\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.554318 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fcd00abe-3b13-42b3-81bc-671095c37415-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dj55k\" (UID: \"fcd00abe-3b13-42b3-81bc-671095c37415\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.557368 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45f6z\" (UniqueName: \"kubernetes.io/projected/fcd00abe-3b13-42b3-81bc-671095c37415-kube-api-access-45f6z\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-dj55k\" (UID: \"fcd00abe-3b13-42b3-81bc-671095c37415\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" Dec 11 13:35:51 crc kubenswrapper[4898]: E1211 13:35:51.579096 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod329b098d_eb3e_413e_b398_697e70fb3ef9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod329b098d_eb3e_413e_b398_697e70fb3ef9.slice/crio-64a065a2c0746a92d6d30874eb60046840911d5669606a6838ad24cfb71a25bb\": RecentStats: unable to find data in memory cache]" Dec 11 13:35:51 crc kubenswrapper[4898]: I1211 13:35:51.695784 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" Dec 11 13:35:52 crc kubenswrapper[4898]: I1211 13:35:52.258375 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k"] Dec 11 13:35:52 crc kubenswrapper[4898]: W1211 13:35:52.261285 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcd00abe_3b13_42b3_81bc_671095c37415.slice/crio-29cb82498b6edc9d6263f1b4585d7696e5f3f0d0c89994f7d9c3162108984d06 WatchSource:0}: Error finding container 29cb82498b6edc9d6263f1b4585d7696e5f3f0d0c89994f7d9c3162108984d06: Status 404 returned error can't find the container with id 29cb82498b6edc9d6263f1b4585d7696e5f3f0d0c89994f7d9c3162108984d06 Dec 11 13:35:52 crc kubenswrapper[4898]: I1211 13:35:52.266857 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 13:35:53 crc kubenswrapper[4898]: I1211 13:35:53.285770 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" event={"ID":"fcd00abe-3b13-42b3-81bc-671095c37415","Type":"ContainerStarted","Data":"29cb82498b6edc9d6263f1b4585d7696e5f3f0d0c89994f7d9c3162108984d06"} Dec 11 13:35:54 crc kubenswrapper[4898]: I1211 13:35:54.303305 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" event={"ID":"fcd00abe-3b13-42b3-81bc-671095c37415","Type":"ContainerStarted","Data":"6bee7838f118c6e5806128a420105eb9b6cf128bc813d346b6f9610c7c512a6f"} Dec 11 13:35:54 crc kubenswrapper[4898]: I1211 13:35:54.343746 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" podStartSLOduration=2.606760792 podStartE2EDuration="3.343720009s" podCreationTimestamp="2025-12-11 13:35:51 
+0000 UTC" firstStartedPulling="2025-12-11 13:35:52.266563354 +0000 UTC m=+1909.838889801" lastFinishedPulling="2025-12-11 13:35:53.003522581 +0000 UTC m=+1910.575849018" observedRunningTime="2025-12-11 13:35:54.322967166 +0000 UTC m=+1911.895293643" watchObservedRunningTime="2025-12-11 13:35:54.343720009 +0000 UTC m=+1911.916046476" Dec 11 13:35:55 crc kubenswrapper[4898]: I1211 13:35:55.049180 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-4pkrx"] Dec 11 13:35:55 crc kubenswrapper[4898]: I1211 13:35:55.064202 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-4pkrx"] Dec 11 13:35:55 crc kubenswrapper[4898]: I1211 13:35:55.775297 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:35:55 crc kubenswrapper[4898]: E1211 13:35:55.775854 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:35:56 crc kubenswrapper[4898]: I1211 13:35:56.791233 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d314c3d5-6a51-4713-bae9-d25641533de2" path="/var/lib/kubelet/pods/d314c3d5-6a51-4713-bae9-d25641533de2/volumes" Dec 11 13:36:10 crc kubenswrapper[4898]: I1211 13:36:10.776249 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:36:10 crc kubenswrapper[4898]: E1211 13:36:10.776945 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:36:22 crc kubenswrapper[4898]: I1211 13:36:22.785811 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:36:22 crc kubenswrapper[4898]: E1211 13:36:22.787154 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:36:30 crc kubenswrapper[4898]: I1211 13:36:30.057159 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-f57gt"] Dec 11 13:36:30 crc kubenswrapper[4898]: I1211 13:36:30.070800 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-f57gt"] Dec 11 13:36:30 crc kubenswrapper[4898]: I1211 13:36:30.800872 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ff1a3a4-ba9d-4155-b289-46de3809f5f4" path="/var/lib/kubelet/pods/4ff1a3a4-ba9d-4155-b289-46de3809f5f4/volumes" Dec 11 13:36:36 crc kubenswrapper[4898]: I1211 13:36:36.776543 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:36:36 crc kubenswrapper[4898]: E1211 13:36:36.777531 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:36:37 crc kubenswrapper[4898]: I1211 13:36:37.089557 4898 scope.go:117] "RemoveContainer" containerID="f0f6384e87bf34cb6ab3ac838d184a33ef92aa13204fb285d70e73c0cee37487" Dec 11 13:36:37 crc kubenswrapper[4898]: I1211 13:36:37.121131 4898 scope.go:117] "RemoveContainer" containerID="b276dd4c401d4d881332ff7574fd26b6e60cd7531037eb63ff2213a045643cb6" Dec 11 13:36:37 crc kubenswrapper[4898]: I1211 13:36:37.185240 4898 scope.go:117] "RemoveContainer" containerID="98d3ce981c0ba2ba0f2609170da8e268d18bba274c4883942640e6a45a718f2c" Dec 11 13:36:37 crc kubenswrapper[4898]: I1211 13:36:37.282128 4898 scope.go:117] "RemoveContainer" containerID="8f1383c507a0d9e655255875ca4a2160451be78073d4b1e6294538ffaa30d522" Dec 11 13:36:37 crc kubenswrapper[4898]: I1211 13:36:37.320876 4898 scope.go:117] "RemoveContainer" containerID="2086865a139fffb7d9fb0bd0965100c91f1332465d40efaaa0782cfb315e8bcc" Dec 11 13:36:37 crc kubenswrapper[4898]: I1211 13:36:37.367177 4898 scope.go:117] "RemoveContainer" containerID="815d602c2ce595b20b6ecfa99677b314b69572a0ae1586f14b22081f787294c8" Dec 11 13:36:37 crc kubenswrapper[4898]: I1211 13:36:37.415118 4898 scope.go:117] "RemoveContainer" containerID="fa5a8c762147c773985fdb23692bd511f587c5c083dfe27381386c6ee7a53f2b" Dec 11 13:36:37 crc kubenswrapper[4898]: I1211 13:36:37.445120 4898 scope.go:117] "RemoveContainer" containerID="1705ff38453876f18c54dd48ada64a13d3af0c4717292a3feaf5b6e58d32fada" Dec 11 13:36:37 crc kubenswrapper[4898]: I1211 13:36:37.471940 4898 scope.go:117] "RemoveContainer" containerID="2aadd6b419d7953b73237ba0b801687d9b696c3257afe7c941accd0fc0fc9c4e" Dec 11 13:36:37 crc kubenswrapper[4898]: I1211 13:36:37.506775 4898 scope.go:117] "RemoveContainer" 
containerID="30bd400b99cd12f96e97ead55af6bcd9b8d9bf06cfe4939424e74ac281a9f0d5" Dec 11 13:36:37 crc kubenswrapper[4898]: I1211 13:36:37.542166 4898 scope.go:117] "RemoveContainer" containerID="99a106f5af64c3e0fc5dc2b7062d46228765e0e0110a7d694a95c2416f1ece26" Dec 11 13:36:40 crc kubenswrapper[4898]: I1211 13:36:40.034480 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-mhjjq"] Dec 11 13:36:40 crc kubenswrapper[4898]: I1211 13:36:40.045515 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-mhjjq"] Dec 11 13:36:40 crc kubenswrapper[4898]: I1211 13:36:40.798075 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b93b66d1-a195-42b8-912d-5029d9f0e6b3" path="/var/lib/kubelet/pods/b93b66d1-a195-42b8-912d-5029d9f0e6b3/volumes" Dec 11 13:36:43 crc kubenswrapper[4898]: I1211 13:36:43.040375 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ml4kr"] Dec 11 13:36:43 crc kubenswrapper[4898]: I1211 13:36:43.056034 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ml4kr"] Dec 11 13:36:44 crc kubenswrapper[4898]: I1211 13:36:44.798320 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="470d01dc-02f0-49f5-912b-087238320dba" path="/var/lib/kubelet/pods/470d01dc-02f0-49f5-912b-087238320dba/volumes" Dec 11 13:36:46 crc kubenswrapper[4898]: I1211 13:36:46.055044 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-gtqwj"] Dec 11 13:36:46 crc kubenswrapper[4898]: I1211 13:36:46.072116 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-gtqwj"] Dec 11 13:36:46 crc kubenswrapper[4898]: I1211 13:36:46.788533 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5924443a-434a-4efc-b04d-fdc73d3e2fe6" path="/var/lib/kubelet/pods/5924443a-434a-4efc-b04d-fdc73d3e2fe6/volumes" Dec 11 13:36:48 crc 
kubenswrapper[4898]: I1211 13:36:48.775760 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:36:48 crc kubenswrapper[4898]: E1211 13:36:48.776633 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:36:58 crc kubenswrapper[4898]: I1211 13:36:58.048619 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-6v7s7"] Dec 11 13:36:58 crc kubenswrapper[4898]: I1211 13:36:58.059851 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-6v7s7"] Dec 11 13:36:58 crc kubenswrapper[4898]: I1211 13:36:58.786601 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d89e82-8f2e-4198-8736-28293404a0bd" path="/var/lib/kubelet/pods/76d89e82-8f2e-4198-8736-28293404a0bd/volumes" Dec 11 13:37:00 crc kubenswrapper[4898]: I1211 13:37:00.775719 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:37:00 crc kubenswrapper[4898]: E1211 13:37:00.776069 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:37:14 crc kubenswrapper[4898]: I1211 13:37:14.776015 4898 scope.go:117] "RemoveContainer" 
containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:37:15 crc kubenswrapper[4898]: I1211 13:37:15.400702 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"2dc6b826c592db36a826a1bc1d1e5ae9c15e49f961f4c157520efd5d4a76b991"} Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.059376 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c008-account-create-update-5t4pl"] Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.081207 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1a34-account-create-update-g82b9"] Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.104097 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d8bf-account-create-update-f5twm"] Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.113431 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c008-account-create-update-5t4pl"] Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.123943 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d8bf-account-create-update-f5twm"] Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.132780 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-tfn2j"] Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.142411 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-rnnwz"] Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.152409 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1a34-account-create-update-g82b9"] Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.162518 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-qnzh8"] Dec 11 13:37:36 crc 
kubenswrapper[4898]: I1211 13:37:36.170936 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-tfn2j"] Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.182181 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-rnnwz"] Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.193539 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-qnzh8"] Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.795148 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1188efd5-cf9c-48dc-bd72-5259294cca4c" path="/var/lib/kubelet/pods/1188efd5-cf9c-48dc-bd72-5259294cca4c/volumes" Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.797029 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b4db1d0-fd46-4731-8392-e8390610a2c8" path="/var/lib/kubelet/pods/2b4db1d0-fd46-4731-8392-e8390610a2c8/volumes" Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.798396 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="446e3383-1a1b-4271-94ca-1662e36059d3" path="/var/lib/kubelet/pods/446e3383-1a1b-4271-94ca-1662e36059d3/volumes" Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.799853 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d173a21-f567-430e-b7fe-c47892f42873" path="/var/lib/kubelet/pods/5d173a21-f567-430e-b7fe-c47892f42873/volumes" Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.802877 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2cfe6b8-8f34-455c-afe3-61b33605d648" path="/var/lib/kubelet/pods/c2cfe6b8-8f34-455c-afe3-61b33605d648/volumes" Dec 11 13:37:36 crc kubenswrapper[4898]: I1211 13:37:36.804576 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f07f8aaa-f41a-4204-881e-274dc9a9ad74" path="/var/lib/kubelet/pods/f07f8aaa-f41a-4204-881e-274dc9a9ad74/volumes" Dec 11 13:37:37 crc 
kubenswrapper[4898]: I1211 13:37:37.828554 4898 scope.go:117] "RemoveContainer" containerID="12bc5f25f4fab7bd424b7ecd41040a348ea7f36cbdbd74a785bd9242cc9e9b44" Dec 11 13:37:37 crc kubenswrapper[4898]: I1211 13:37:37.869520 4898 scope.go:117] "RemoveContainer" containerID="b22c69ce81887d572d8d77d2dcea77704ea3414613a2ccc6e4b6e2abc100a5c4" Dec 11 13:37:37 crc kubenswrapper[4898]: I1211 13:37:37.943798 4898 scope.go:117] "RemoveContainer" containerID="5eb8b1fac0eefc98cdd4aa11ee9069c852eb895b98d696667be2c09f7aa07bbb" Dec 11 13:37:38 crc kubenswrapper[4898]: I1211 13:37:38.001366 4898 scope.go:117] "RemoveContainer" containerID="22c0ea90004f2589fdd305567b0d291a7c8b35733c827f0fd4d110900c546170" Dec 11 13:37:38 crc kubenswrapper[4898]: I1211 13:37:38.092115 4898 scope.go:117] "RemoveContainer" containerID="9f62647d653af839ebc11879be82bcfc9ac18bc6d9e6ce2f25d2671618157129" Dec 11 13:37:38 crc kubenswrapper[4898]: I1211 13:37:38.121605 4898 scope.go:117] "RemoveContainer" containerID="31418266dbda0e303b0b1289c6ae139d3a6995a48d451fcd08d3ae4def89749b" Dec 11 13:37:38 crc kubenswrapper[4898]: I1211 13:37:38.183197 4898 scope.go:117] "RemoveContainer" containerID="70ef5f3eb8612084e2aa274d7d73c0d22001ad25ea423aa131b75b0ee1ce27dc" Dec 11 13:37:38 crc kubenswrapper[4898]: I1211 13:37:38.226276 4898 scope.go:117] "RemoveContainer" containerID="dc81a7b7c44e4ed19357ecb521a7ccadf35a545b8d09718313ff58bd79adce59" Dec 11 13:37:38 crc kubenswrapper[4898]: I1211 13:37:38.253217 4898 scope.go:117] "RemoveContainer" containerID="87016d8e9b9f797522f428ba0d870d8d0bee8cdfd5017a713ac92eb31e20f22f" Dec 11 13:37:38 crc kubenswrapper[4898]: I1211 13:37:38.282422 4898 scope.go:117] "RemoveContainer" containerID="e4c171d2cc66e164716bb30d991949d4fbde33237792a70a6a026fe33cf71144" Dec 11 13:38:02 crc kubenswrapper[4898]: I1211 13:38:02.061405 4898 generic.go:334] "Generic (PLEG): container finished" podID="fcd00abe-3b13-42b3-81bc-671095c37415" 
containerID="6bee7838f118c6e5806128a420105eb9b6cf128bc813d346b6f9610c7c512a6f" exitCode=0 Dec 11 13:38:02 crc kubenswrapper[4898]: I1211 13:38:02.061514 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" event={"ID":"fcd00abe-3b13-42b3-81bc-671095c37415","Type":"ContainerDied","Data":"6bee7838f118c6e5806128a420105eb9b6cf128bc813d346b6f9610c7c512a6f"} Dec 11 13:38:03 crc kubenswrapper[4898]: I1211 13:38:03.571085 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" Dec 11 13:38:03 crc kubenswrapper[4898]: I1211 13:38:03.668118 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fcd00abe-3b13-42b3-81bc-671095c37415-inventory\") pod \"fcd00abe-3b13-42b3-81bc-671095c37415\" (UID: \"fcd00abe-3b13-42b3-81bc-671095c37415\") " Dec 11 13:38:03 crc kubenswrapper[4898]: I1211 13:38:03.668688 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fcd00abe-3b13-42b3-81bc-671095c37415-ssh-key\") pod \"fcd00abe-3b13-42b3-81bc-671095c37415\" (UID: \"fcd00abe-3b13-42b3-81bc-671095c37415\") " Dec 11 13:38:03 crc kubenswrapper[4898]: I1211 13:38:03.669169 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45f6z\" (UniqueName: \"kubernetes.io/projected/fcd00abe-3b13-42b3-81bc-671095c37415-kube-api-access-45f6z\") pod \"fcd00abe-3b13-42b3-81bc-671095c37415\" (UID: \"fcd00abe-3b13-42b3-81bc-671095c37415\") " Dec 11 13:38:03 crc kubenswrapper[4898]: I1211 13:38:03.674690 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcd00abe-3b13-42b3-81bc-671095c37415-kube-api-access-45f6z" (OuterVolumeSpecName: "kube-api-access-45f6z") pod 
"fcd00abe-3b13-42b3-81bc-671095c37415" (UID: "fcd00abe-3b13-42b3-81bc-671095c37415"). InnerVolumeSpecName "kube-api-access-45f6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:38:03 crc kubenswrapper[4898]: I1211 13:38:03.708332 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd00abe-3b13-42b3-81bc-671095c37415-inventory" (OuterVolumeSpecName: "inventory") pod "fcd00abe-3b13-42b3-81bc-671095c37415" (UID: "fcd00abe-3b13-42b3-81bc-671095c37415"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:38:03 crc kubenswrapper[4898]: I1211 13:38:03.728127 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd00abe-3b13-42b3-81bc-671095c37415-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fcd00abe-3b13-42b3-81bc-671095c37415" (UID: "fcd00abe-3b13-42b3-81bc-671095c37415"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:38:03 crc kubenswrapper[4898]: I1211 13:38:03.773124 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fcd00abe-3b13-42b3-81bc-671095c37415-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:38:03 crc kubenswrapper[4898]: I1211 13:38:03.773426 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fcd00abe-3b13-42b3-81bc-671095c37415-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:38:03 crc kubenswrapper[4898]: I1211 13:38:03.773438 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45f6z\" (UniqueName: \"kubernetes.io/projected/fcd00abe-3b13-42b3-81bc-671095c37415-kube-api-access-45f6z\") on node \"crc\" DevicePath \"\"" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.126679 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" event={"ID":"fcd00abe-3b13-42b3-81bc-671095c37415","Type":"ContainerDied","Data":"29cb82498b6edc9d6263f1b4585d7696e5f3f0d0c89994f7d9c3162108984d06"} Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.126724 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29cb82498b6edc9d6263f1b4585d7696e5f3f0d0c89994f7d9c3162108984d06" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.126785 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-dj55k" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.201626 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km"] Dec 11 13:38:04 crc kubenswrapper[4898]: E1211 13:38:04.202297 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd00abe-3b13-42b3-81bc-671095c37415" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.202323 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd00abe-3b13-42b3-81bc-671095c37415" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.202720 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcd00abe-3b13-42b3-81bc-671095c37415" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.203861 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.207193 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.209248 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.209445 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.209566 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.227689 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km"] Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.292318 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/112b077a-0512-4528-8b26-158d512f09ad-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-df9km\" (UID: \"112b077a-0512-4528-8b26-158d512f09ad\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.292552 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/112b077a-0512-4528-8b26-158d512f09ad-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-df9km\" (UID: \"112b077a-0512-4528-8b26-158d512f09ad\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.294229 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g9dp\" (UniqueName: \"kubernetes.io/projected/112b077a-0512-4528-8b26-158d512f09ad-kube-api-access-4g9dp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-df9km\" (UID: \"112b077a-0512-4528-8b26-158d512f09ad\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.397524 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/112b077a-0512-4528-8b26-158d512f09ad-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-df9km\" (UID: \"112b077a-0512-4528-8b26-158d512f09ad\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.397622 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/112b077a-0512-4528-8b26-158d512f09ad-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-df9km\" (UID: \"112b077a-0512-4528-8b26-158d512f09ad\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.397903 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g9dp\" (UniqueName: \"kubernetes.io/projected/112b077a-0512-4528-8b26-158d512f09ad-kube-api-access-4g9dp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-df9km\" (UID: \"112b077a-0512-4528-8b26-158d512f09ad\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.403319 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/112b077a-0512-4528-8b26-158d512f09ad-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-df9km\" (UID: \"112b077a-0512-4528-8b26-158d512f09ad\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.404940 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/112b077a-0512-4528-8b26-158d512f09ad-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-df9km\" (UID: \"112b077a-0512-4528-8b26-158d512f09ad\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.415024 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g9dp\" (UniqueName: \"kubernetes.io/projected/112b077a-0512-4528-8b26-158d512f09ad-kube-api-access-4g9dp\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-df9km\" (UID: \"112b077a-0512-4528-8b26-158d512f09ad\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" Dec 11 13:38:04 crc kubenswrapper[4898]: I1211 13:38:04.557610 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" Dec 11 13:38:05 crc kubenswrapper[4898]: I1211 13:38:05.177986 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km"] Dec 11 13:38:06 crc kubenswrapper[4898]: I1211 13:38:06.155747 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" event={"ID":"112b077a-0512-4528-8b26-158d512f09ad","Type":"ContainerStarted","Data":"0d34cd528665501a66d3cd8360ce2025adac32cc43f2bf9f43d839623f09ff4e"} Dec 11 13:38:07 crc kubenswrapper[4898]: I1211 13:38:07.173217 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" event={"ID":"112b077a-0512-4528-8b26-158d512f09ad","Type":"ContainerStarted","Data":"b54d0d10724223d6b7febd09a41492b03a9a2929aa0048ec12e0b3cacdbabdb3"} Dec 11 13:38:07 crc kubenswrapper[4898]: I1211 13:38:07.206203 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" podStartSLOduration=2.191694366 podStartE2EDuration="3.206174715s" podCreationTimestamp="2025-12-11 13:38:04 +0000 UTC" firstStartedPulling="2025-12-11 13:38:05.18581739 +0000 UTC m=+2042.758143827" lastFinishedPulling="2025-12-11 13:38:06.200297739 +0000 UTC m=+2043.772624176" observedRunningTime="2025-12-11 13:38:07.192625917 +0000 UTC m=+2044.764952354" watchObservedRunningTime="2025-12-11 13:38:07.206174715 +0000 UTC m=+2044.778501162" Dec 11 13:38:15 crc kubenswrapper[4898]: I1211 13:38:15.045953 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rpgj6"] Dec 11 13:38:15 crc kubenswrapper[4898]: I1211 13:38:15.057668 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rpgj6"] Dec 11 13:38:16 
crc kubenswrapper[4898]: I1211 13:38:16.792637 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="421ec273-9526-4100-9a5a-63e0512beee3" path="/var/lib/kubelet/pods/421ec273-9526-4100-9a5a-63e0512beee3/volumes" Dec 11 13:38:33 crc kubenswrapper[4898]: I1211 13:38:33.032620 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-hbds7"] Dec 11 13:38:33 crc kubenswrapper[4898]: I1211 13:38:33.045715 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-3c53-account-create-update-kzh6s"] Dec 11 13:38:33 crc kubenswrapper[4898]: I1211 13:38:33.060152 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-3c53-account-create-update-kzh6s"] Dec 11 13:38:33 crc kubenswrapper[4898]: I1211 13:38:33.075016 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-hbds7"] Dec 11 13:38:34 crc kubenswrapper[4898]: I1211 13:38:34.790650 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ca26903-ffb9-4637-aac0-5284a81cbe85" path="/var/lib/kubelet/pods/5ca26903-ffb9-4637-aac0-5284a81cbe85/volumes" Dec 11 13:38:34 crc kubenswrapper[4898]: I1211 13:38:34.792191 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="919f5290-479c-4afc-9160-f69d2f2b1e09" path="/var/lib/kubelet/pods/919f5290-479c-4afc-9160-f69d2f2b1e09/volumes" Dec 11 13:38:38 crc kubenswrapper[4898]: I1211 13:38:38.569994 4898 scope.go:117] "RemoveContainer" containerID="16872782332a1dbb940497943f7cacb007dc4833d1469644a0b34aa6937ab657" Dec 11 13:38:38 crc kubenswrapper[4898]: I1211 13:38:38.611823 4898 scope.go:117] "RemoveContainer" containerID="f5f0d7867b6e6d2bfc699b65ac6b96c700a4e6aaf23dcb4464fe1c5b49fa3507" Dec 11 13:38:38 crc kubenswrapper[4898]: I1211 13:38:38.654643 4898 scope.go:117] "RemoveContainer" containerID="10e802201759470e332ecae05d7658d71baf79f445281506f97ea9311a53af8e" Dec 11 13:38:42 crc kubenswrapper[4898]: I1211 13:38:42.035175 4898 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-dgfd5"] Dec 11 13:38:42 crc kubenswrapper[4898]: I1211 13:38:42.046394 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-dgfd5"] Dec 11 13:38:42 crc kubenswrapper[4898]: I1211 13:38:42.797982 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17ff0417-b6d7-42f4-9de4-e2482b659fc2" path="/var/lib/kubelet/pods/17ff0417-b6d7-42f4-9de4-e2482b659fc2/volumes" Dec 11 13:38:45 crc kubenswrapper[4898]: I1211 13:38:45.038017 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sjh99"] Dec 11 13:38:45 crc kubenswrapper[4898]: I1211 13:38:45.047816 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sjh99"] Dec 11 13:38:46 crc kubenswrapper[4898]: I1211 13:38:46.789617 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac397035-8dbc-46c6-8e4c-d4c25cb38c8f" path="/var/lib/kubelet/pods/ac397035-8dbc-46c6-8e4c-d4c25cb38c8f/volumes" Dec 11 13:39:21 crc kubenswrapper[4898]: I1211 13:39:21.739940 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x548r"] Dec 11 13:39:21 crc kubenswrapper[4898]: I1211 13:39:21.744488 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:21 crc kubenswrapper[4898]: I1211 13:39:21.756738 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x548r"] Dec 11 13:39:21 crc kubenswrapper[4898]: I1211 13:39:21.859390 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-catalog-content\") pod \"certified-operators-x548r\" (UID: \"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c\") " pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:21 crc kubenswrapper[4898]: I1211 13:39:21.859792 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-utilities\") pod \"certified-operators-x548r\" (UID: \"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c\") " pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:21 crc kubenswrapper[4898]: I1211 13:39:21.860188 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgqsz\" (UniqueName: \"kubernetes.io/projected/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-kube-api-access-xgqsz\") pod \"certified-operators-x548r\" (UID: \"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c\") " pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:21 crc kubenswrapper[4898]: I1211 13:39:21.963124 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-utilities\") pod \"certified-operators-x548r\" (UID: \"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c\") " pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:21 crc kubenswrapper[4898]: I1211 13:39:21.963305 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xgqsz\" (UniqueName: \"kubernetes.io/projected/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-kube-api-access-xgqsz\") pod \"certified-operators-x548r\" (UID: \"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c\") " pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:21 crc kubenswrapper[4898]: I1211 13:39:21.963373 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-catalog-content\") pod \"certified-operators-x548r\" (UID: \"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c\") " pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:21 crc kubenswrapper[4898]: I1211 13:39:21.963980 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-catalog-content\") pod \"certified-operators-x548r\" (UID: \"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c\") " pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:21 crc kubenswrapper[4898]: I1211 13:39:21.964210 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-utilities\") pod \"certified-operators-x548r\" (UID: \"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c\") " pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:21 crc kubenswrapper[4898]: I1211 13:39:21.985606 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgqsz\" (UniqueName: \"kubernetes.io/projected/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-kube-api-access-xgqsz\") pod \"certified-operators-x548r\" (UID: \"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c\") " pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:22 crc kubenswrapper[4898]: I1211 13:39:22.068618 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:22 crc kubenswrapper[4898]: I1211 13:39:22.561982 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x548r"] Dec 11 13:39:23 crc kubenswrapper[4898]: I1211 13:39:23.044233 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7a63ef4-d54b-4e53-abdf-a8f2c720f51c" containerID="a370e9f42ec63af0149f76d9c9640dce31f5c76f823136180dde9be44ca2accd" exitCode=0 Dec 11 13:39:23 crc kubenswrapper[4898]: I1211 13:39:23.044287 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x548r" event={"ID":"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c","Type":"ContainerDied","Data":"a370e9f42ec63af0149f76d9c9640dce31f5c76f823136180dde9be44ca2accd"} Dec 11 13:39:23 crc kubenswrapper[4898]: I1211 13:39:23.044317 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x548r" event={"ID":"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c","Type":"ContainerStarted","Data":"1f073faa09f7f430f65f052e6d61fd36dc314bd01011ad07f031f38e44379e2c"} Dec 11 13:39:24 crc kubenswrapper[4898]: I1211 13:39:24.058751 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x548r" event={"ID":"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c","Type":"ContainerStarted","Data":"570c593eea35b0e6ea2515a7a12f266d7f1c004bb752f5ed71a1e73978513863"} Dec 11 13:39:24 crc kubenswrapper[4898]: I1211 13:39:24.063016 4898 generic.go:334] "Generic (PLEG): container finished" podID="112b077a-0512-4528-8b26-158d512f09ad" containerID="b54d0d10724223d6b7febd09a41492b03a9a2929aa0048ec12e0b3cacdbabdb3" exitCode=0 Dec 11 13:39:24 crc kubenswrapper[4898]: I1211 13:39:24.063105 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" 
event={"ID":"112b077a-0512-4528-8b26-158d512f09ad","Type":"ContainerDied","Data":"b54d0d10724223d6b7febd09a41492b03a9a2929aa0048ec12e0b3cacdbabdb3"} Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.540344 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sbn4d"] Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.544064 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.562313 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbn4d"] Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.653055 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm7m7\" (UniqueName: \"kubernetes.io/projected/959cd436-0db4-416b-a554-a7ee662e4744-kube-api-access-sm7m7\") pod \"redhat-marketplace-sbn4d\" (UID: \"959cd436-0db4-416b-a554-a7ee662e4744\") " pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.653175 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/959cd436-0db4-416b-a554-a7ee662e4744-catalog-content\") pod \"redhat-marketplace-sbn4d\" (UID: \"959cd436-0db4-416b-a554-a7ee662e4744\") " pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.653301 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/959cd436-0db4-416b-a554-a7ee662e4744-utilities\") pod \"redhat-marketplace-sbn4d\" (UID: \"959cd436-0db4-416b-a554-a7ee662e4744\") " pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.733079 4898 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.755862 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/959cd436-0db4-416b-a554-a7ee662e4744-utilities\") pod \"redhat-marketplace-sbn4d\" (UID: \"959cd436-0db4-416b-a554-a7ee662e4744\") " pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.756198 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm7m7\" (UniqueName: \"kubernetes.io/projected/959cd436-0db4-416b-a554-a7ee662e4744-kube-api-access-sm7m7\") pod \"redhat-marketplace-sbn4d\" (UID: \"959cd436-0db4-416b-a554-a7ee662e4744\") " pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.756360 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/959cd436-0db4-416b-a554-a7ee662e4744-catalog-content\") pod \"redhat-marketplace-sbn4d\" (UID: \"959cd436-0db4-416b-a554-a7ee662e4744\") " pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.756569 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/959cd436-0db4-416b-a554-a7ee662e4744-utilities\") pod \"redhat-marketplace-sbn4d\" (UID: \"959cd436-0db4-416b-a554-a7ee662e4744\") " pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.756767 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/959cd436-0db4-416b-a554-a7ee662e4744-catalog-content\") pod \"redhat-marketplace-sbn4d\" 
(UID: \"959cd436-0db4-416b-a554-a7ee662e4744\") " pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.780970 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm7m7\" (UniqueName: \"kubernetes.io/projected/959cd436-0db4-416b-a554-a7ee662e4744-kube-api-access-sm7m7\") pod \"redhat-marketplace-sbn4d\" (UID: \"959cd436-0db4-416b-a554-a7ee662e4744\") " pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.858085 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g9dp\" (UniqueName: \"kubernetes.io/projected/112b077a-0512-4528-8b26-158d512f09ad-kube-api-access-4g9dp\") pod \"112b077a-0512-4528-8b26-158d512f09ad\" (UID: \"112b077a-0512-4528-8b26-158d512f09ad\") " Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.858221 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/112b077a-0512-4528-8b26-158d512f09ad-inventory\") pod \"112b077a-0512-4528-8b26-158d512f09ad\" (UID: \"112b077a-0512-4528-8b26-158d512f09ad\") " Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.858396 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/112b077a-0512-4528-8b26-158d512f09ad-ssh-key\") pod \"112b077a-0512-4528-8b26-158d512f09ad\" (UID: \"112b077a-0512-4528-8b26-158d512f09ad\") " Dec 11 13:39:25 crc kubenswrapper[4898]: I1211 13:39:25.873037 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.222352 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/112b077a-0512-4528-8b26-158d512f09ad-kube-api-access-4g9dp" (OuterVolumeSpecName: "kube-api-access-4g9dp") pod "112b077a-0512-4528-8b26-158d512f09ad" (UID: "112b077a-0512-4528-8b26-158d512f09ad"). InnerVolumeSpecName "kube-api-access-4g9dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.329895 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g9dp\" (UniqueName: \"kubernetes.io/projected/112b077a-0512-4528-8b26-158d512f09ad-kube-api-access-4g9dp\") on node \"crc\" DevicePath \"\"" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.383548 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112b077a-0512-4528-8b26-158d512f09ad-inventory" (OuterVolumeSpecName: "inventory") pod "112b077a-0512-4528-8b26-158d512f09ad" (UID: "112b077a-0512-4528-8b26-158d512f09ad"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.386009 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" event={"ID":"112b077a-0512-4528-8b26-158d512f09ad","Type":"ContainerDied","Data":"0d34cd528665501a66d3cd8360ce2025adac32cc43f2bf9f43d839623f09ff4e"} Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.386131 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d34cd528665501a66d3cd8360ce2025adac32cc43f2bf9f43d839623f09ff4e" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.386137 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-df9km" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.441385 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112b077a-0512-4528-8b26-158d512f09ad-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "112b077a-0512-4528-8b26-158d512f09ad" (UID: "112b077a-0512-4528-8b26-158d512f09ad"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.442602 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb"] Dec 11 13:39:26 crc kubenswrapper[4898]: E1211 13:39:26.479470 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112b077a-0512-4528-8b26-158d512f09ad" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.479500 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="112b077a-0512-4528-8b26-158d512f09ad" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.479941 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="112b077a-0512-4528-8b26-158d512f09ad" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.480758 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/112b077a-0512-4528-8b26-158d512f09ad-ssh-key\") pod \"112b077a-0512-4528-8b26-158d512f09ad\" (UID: \"112b077a-0512-4528-8b26-158d512f09ad\") " Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.480829 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.481440 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/112b077a-0512-4528-8b26-158d512f09ad-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.494612 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb"] Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.587117 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f59c773-f89d-41f6-97a4-26e5a362d79f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dwntb\" (UID: \"9f59c773-f89d-41f6-97a4-26e5a362d79f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.587538 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxqsc\" (UniqueName: \"kubernetes.io/projected/9f59c773-f89d-41f6-97a4-26e5a362d79f-kube-api-access-wxqsc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dwntb\" (UID: \"9f59c773-f89d-41f6-97a4-26e5a362d79f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.587620 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f59c773-f89d-41f6-97a4-26e5a362d79f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dwntb\" (UID: \"9f59c773-f89d-41f6-97a4-26e5a362d79f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.690019 
4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxqsc\" (UniqueName: \"kubernetes.io/projected/9f59c773-f89d-41f6-97a4-26e5a362d79f-kube-api-access-wxqsc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dwntb\" (UID: \"9f59c773-f89d-41f6-97a4-26e5a362d79f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.690115 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f59c773-f89d-41f6-97a4-26e5a362d79f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dwntb\" (UID: \"9f59c773-f89d-41f6-97a4-26e5a362d79f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.690224 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f59c773-f89d-41f6-97a4-26e5a362d79f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dwntb\" (UID: \"9f59c773-f89d-41f6-97a4-26e5a362d79f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.694275 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f59c773-f89d-41f6-97a4-26e5a362d79f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dwntb\" (UID: \"9f59c773-f89d-41f6-97a4-26e5a362d79f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.694307 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f59c773-f89d-41f6-97a4-26e5a362d79f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dwntb\" 
(UID: \"9f59c773-f89d-41f6-97a4-26e5a362d79f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.714594 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxqsc\" (UniqueName: \"kubernetes.io/projected/9f59c773-f89d-41f6-97a4-26e5a362d79f-kube-api-access-wxqsc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dwntb\" (UID: \"9f59c773-f89d-41f6-97a4-26e5a362d79f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" Dec 11 13:39:26 crc kubenswrapper[4898]: W1211 13:39:26.730287 4898 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/112b077a-0512-4528-8b26-158d512f09ad/volumes/kubernetes.io~secret/ssh-key Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.730360 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112b077a-0512-4528-8b26-158d512f09ad-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "112b077a-0512-4528-8b26-158d512f09ad" (UID: "112b077a-0512-4528-8b26-158d512f09ad"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.788711 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbn4d"] Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.792264 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/112b077a-0512-4528-8b26-158d512f09ad-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:39:26 crc kubenswrapper[4898]: I1211 13:39:26.805119 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" Dec 11 13:39:27 crc kubenswrapper[4898]: I1211 13:39:27.079168 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-2hj8k"] Dec 11 13:39:27 crc kubenswrapper[4898]: I1211 13:39:27.092969 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-2hj8k"] Dec 11 13:39:27 crc kubenswrapper[4898]: I1211 13:39:27.398594 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7a63ef4-d54b-4e53-abdf-a8f2c720f51c" containerID="570c593eea35b0e6ea2515a7a12f266d7f1c004bb752f5ed71a1e73978513863" exitCode=0 Dec 11 13:39:27 crc kubenswrapper[4898]: I1211 13:39:27.398702 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x548r" event={"ID":"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c","Type":"ContainerDied","Data":"570c593eea35b0e6ea2515a7a12f266d7f1c004bb752f5ed71a1e73978513863"} Dec 11 13:39:27 crc kubenswrapper[4898]: I1211 13:39:27.401948 4898 generic.go:334] "Generic (PLEG): container finished" podID="959cd436-0db4-416b-a554-a7ee662e4744" containerID="d188c2946561dc0c4424017120d8d9639c39278acded6651b359f72ea0c6e9e0" exitCode=0 Dec 11 13:39:27 crc kubenswrapper[4898]: I1211 13:39:27.401994 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbn4d" event={"ID":"959cd436-0db4-416b-a554-a7ee662e4744","Type":"ContainerDied","Data":"d188c2946561dc0c4424017120d8d9639c39278acded6651b359f72ea0c6e9e0"} Dec 11 13:39:27 crc kubenswrapper[4898]: I1211 13:39:27.402028 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbn4d" event={"ID":"959cd436-0db4-416b-a554-a7ee662e4744","Type":"ContainerStarted","Data":"e6901a5c0794b20c3999e89aa3e5314822d86b50f84469590fd73af98e9ebef8"} Dec 11 13:39:27 crc kubenswrapper[4898]: I1211 13:39:27.492503 4898 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb"] Dec 11 13:39:28 crc kubenswrapper[4898]: I1211 13:39:28.413734 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x548r" event={"ID":"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c","Type":"ContainerStarted","Data":"b98617b84f65a5135c7da3b4d2400157dc98df63be35591143c07c8dd414d7f7"} Dec 11 13:39:28 crc kubenswrapper[4898]: I1211 13:39:28.419272 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbn4d" event={"ID":"959cd436-0db4-416b-a554-a7ee662e4744","Type":"ContainerStarted","Data":"4a68ca091fb28e94da6263f1b8c5bb4afbae49588318deeb7a541f13e0bdfc39"} Dec 11 13:39:28 crc kubenswrapper[4898]: I1211 13:39:28.424134 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" event={"ID":"9f59c773-f89d-41f6-97a4-26e5a362d79f","Type":"ContainerStarted","Data":"61ee2597887cd6e45e18849c95e20f0b842e9bba946ca9836fa0617761969765"} Dec 11 13:39:28 crc kubenswrapper[4898]: I1211 13:39:28.451370 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x548r" podStartSLOduration=2.60263449 podStartE2EDuration="7.451350183s" podCreationTimestamp="2025-12-11 13:39:21 +0000 UTC" firstStartedPulling="2025-12-11 13:39:23.046639022 +0000 UTC m=+2120.618965459" lastFinishedPulling="2025-12-11 13:39:27.895354715 +0000 UTC m=+2125.467681152" observedRunningTime="2025-12-11 13:39:28.441956538 +0000 UTC m=+2126.014282975" watchObservedRunningTime="2025-12-11 13:39:28.451350183 +0000 UTC m=+2126.023676620" Dec 11 13:39:28 crc kubenswrapper[4898]: I1211 13:39:28.789360 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4188522-fa4a-4c8e-92af-dd304dbc64f1" path="/var/lib/kubelet/pods/c4188522-fa4a-4c8e-92af-dd304dbc64f1/volumes" Dec 11 
13:39:29 crc kubenswrapper[4898]: I1211 13:39:29.440204 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" event={"ID":"9f59c773-f89d-41f6-97a4-26e5a362d79f","Type":"ContainerStarted","Data":"4aa7d4e6d6646daa59b29a2201c16ef509c95bcfa21690e90b5f0d05399608b2"} Dec 11 13:39:29 crc kubenswrapper[4898]: I1211 13:39:29.444783 4898 generic.go:334] "Generic (PLEG): container finished" podID="959cd436-0db4-416b-a554-a7ee662e4744" containerID="4a68ca091fb28e94da6263f1b8c5bb4afbae49588318deeb7a541f13e0bdfc39" exitCode=0 Dec 11 13:39:29 crc kubenswrapper[4898]: I1211 13:39:29.444858 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbn4d" event={"ID":"959cd436-0db4-416b-a554-a7ee662e4744","Type":"ContainerDied","Data":"4a68ca091fb28e94da6263f1b8c5bb4afbae49588318deeb7a541f13e0bdfc39"} Dec 11 13:39:29 crc kubenswrapper[4898]: I1211 13:39:29.467525 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" podStartSLOduration=2.555883259 podStartE2EDuration="3.467506397s" podCreationTimestamp="2025-12-11 13:39:26 +0000 UTC" firstStartedPulling="2025-12-11 13:39:27.493689786 +0000 UTC m=+2125.066016223" lastFinishedPulling="2025-12-11 13:39:28.405312924 +0000 UTC m=+2125.977639361" observedRunningTime="2025-12-11 13:39:29.464584288 +0000 UTC m=+2127.036910735" watchObservedRunningTime="2025-12-11 13:39:29.467506397 +0000 UTC m=+2127.039832834" Dec 11 13:39:30 crc kubenswrapper[4898]: I1211 13:39:30.458672 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbn4d" event={"ID":"959cd436-0db4-416b-a554-a7ee662e4744","Type":"ContainerStarted","Data":"65c3825f8815a85583abf60e517d3c55980f573b09a6343b9b5c0e045531c8e0"} Dec 11 13:39:30 crc kubenswrapper[4898]: I1211 13:39:30.489647 4898 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sbn4d" podStartSLOduration=2.86015138 podStartE2EDuration="5.489627394s" podCreationTimestamp="2025-12-11 13:39:25 +0000 UTC" firstStartedPulling="2025-12-11 13:39:27.405187254 +0000 UTC m=+2124.977513691" lastFinishedPulling="2025-12-11 13:39:30.034663268 +0000 UTC m=+2127.606989705" observedRunningTime="2025-12-11 13:39:30.482100159 +0000 UTC m=+2128.054426606" watchObservedRunningTime="2025-12-11 13:39:30.489627394 +0000 UTC m=+2128.061953831" Dec 11 13:39:32 crc kubenswrapper[4898]: I1211 13:39:32.069259 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:32 crc kubenswrapper[4898]: I1211 13:39:32.069787 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:32 crc kubenswrapper[4898]: I1211 13:39:32.132242 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:34 crc kubenswrapper[4898]: I1211 13:39:34.996043 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:39:34 crc kubenswrapper[4898]: I1211 13:39:34.996668 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:39:35 crc kubenswrapper[4898]: I1211 13:39:35.522902 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="9f59c773-f89d-41f6-97a4-26e5a362d79f" containerID="4aa7d4e6d6646daa59b29a2201c16ef509c95bcfa21690e90b5f0d05399608b2" exitCode=0 Dec 11 13:39:35 crc kubenswrapper[4898]: I1211 13:39:35.522967 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" event={"ID":"9f59c773-f89d-41f6-97a4-26e5a362d79f","Type":"ContainerDied","Data":"4aa7d4e6d6646daa59b29a2201c16ef509c95bcfa21690e90b5f0d05399608b2"} Dec 11 13:39:35 crc kubenswrapper[4898]: I1211 13:39:35.874236 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:35 crc kubenswrapper[4898]: I1211 13:39:35.874675 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:35 crc kubenswrapper[4898]: I1211 13:39:35.972137 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:36 crc kubenswrapper[4898]: I1211 13:39:36.596561 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:36 crc kubenswrapper[4898]: I1211 13:39:36.682296 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbn4d"] Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.056152 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.171397 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxqsc\" (UniqueName: \"kubernetes.io/projected/9f59c773-f89d-41f6-97a4-26e5a362d79f-kube-api-access-wxqsc\") pod \"9f59c773-f89d-41f6-97a4-26e5a362d79f\" (UID: \"9f59c773-f89d-41f6-97a4-26e5a362d79f\") " Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.171858 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f59c773-f89d-41f6-97a4-26e5a362d79f-inventory\") pod \"9f59c773-f89d-41f6-97a4-26e5a362d79f\" (UID: \"9f59c773-f89d-41f6-97a4-26e5a362d79f\") " Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.172029 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f59c773-f89d-41f6-97a4-26e5a362d79f-ssh-key\") pod \"9f59c773-f89d-41f6-97a4-26e5a362d79f\" (UID: \"9f59c773-f89d-41f6-97a4-26e5a362d79f\") " Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.178388 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f59c773-f89d-41f6-97a4-26e5a362d79f-kube-api-access-wxqsc" (OuterVolumeSpecName: "kube-api-access-wxqsc") pod "9f59c773-f89d-41f6-97a4-26e5a362d79f" (UID: "9f59c773-f89d-41f6-97a4-26e5a362d79f"). InnerVolumeSpecName "kube-api-access-wxqsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.221987 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f59c773-f89d-41f6-97a4-26e5a362d79f-inventory" (OuterVolumeSpecName: "inventory") pod "9f59c773-f89d-41f6-97a4-26e5a362d79f" (UID: "9f59c773-f89d-41f6-97a4-26e5a362d79f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.235127 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f59c773-f89d-41f6-97a4-26e5a362d79f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9f59c773-f89d-41f6-97a4-26e5a362d79f" (UID: "9f59c773-f89d-41f6-97a4-26e5a362d79f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.275855 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f59c773-f89d-41f6-97a4-26e5a362d79f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.275893 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxqsc\" (UniqueName: \"kubernetes.io/projected/9f59c773-f89d-41f6-97a4-26e5a362d79f-kube-api-access-wxqsc\") on node \"crc\" DevicePath \"\"" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.275912 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f59c773-f89d-41f6-97a4-26e5a362d79f-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.549150 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" event={"ID":"9f59c773-f89d-41f6-97a4-26e5a362d79f","Type":"ContainerDied","Data":"61ee2597887cd6e45e18849c95e20f0b842e9bba946ca9836fa0617761969765"} Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.549215 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61ee2597887cd6e45e18849c95e20f0b842e9bba946ca9836fa0617761969765" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.549178 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dwntb" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.621827 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4"] Dec 11 13:39:37 crc kubenswrapper[4898]: E1211 13:39:37.623077 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f59c773-f89d-41f6-97a4-26e5a362d79f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.623162 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f59c773-f89d-41f6-97a4-26e5a362d79f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.623570 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f59c773-f89d-41f6-97a4-26e5a362d79f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.624682 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.627470 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.627487 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.628111 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.632637 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.632937 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4"] Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.786855 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c0edae9-777b-4b14-9014-8e7dddcc6319-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv4d4\" (UID: \"0c0edae9-777b-4b14-9014-8e7dddcc6319\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.786899 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sh7h\" (UniqueName: \"kubernetes.io/projected/0c0edae9-777b-4b14-9014-8e7dddcc6319-kube-api-access-5sh7h\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv4d4\" (UID: \"0c0edae9-777b-4b14-9014-8e7dddcc6319\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.787512 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c0edae9-777b-4b14-9014-8e7dddcc6319-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv4d4\" (UID: \"0c0edae9-777b-4b14-9014-8e7dddcc6319\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.889286 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c0edae9-777b-4b14-9014-8e7dddcc6319-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv4d4\" (UID: \"0c0edae9-777b-4b14-9014-8e7dddcc6319\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.889693 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c0edae9-777b-4b14-9014-8e7dddcc6319-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv4d4\" (UID: \"0c0edae9-777b-4b14-9014-8e7dddcc6319\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.889718 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sh7h\" (UniqueName: \"kubernetes.io/projected/0c0edae9-777b-4b14-9014-8e7dddcc6319-kube-api-access-5sh7h\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv4d4\" (UID: \"0c0edae9-777b-4b14-9014-8e7dddcc6319\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.897759 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c0edae9-777b-4b14-9014-8e7dddcc6319-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv4d4\" (UID: 
\"0c0edae9-777b-4b14-9014-8e7dddcc6319\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.898256 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c0edae9-777b-4b14-9014-8e7dddcc6319-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv4d4\" (UID: \"0c0edae9-777b-4b14-9014-8e7dddcc6319\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.920357 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sh7h\" (UniqueName: \"kubernetes.io/projected/0c0edae9-777b-4b14-9014-8e7dddcc6319-kube-api-access-5sh7h\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lv4d4\" (UID: \"0c0edae9-777b-4b14-9014-8e7dddcc6319\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" Dec 11 13:39:37 crc kubenswrapper[4898]: I1211 13:39:37.950341 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" Dec 11 13:39:38 crc kubenswrapper[4898]: I1211 13:39:38.559292 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sbn4d" podUID="959cd436-0db4-416b-a554-a7ee662e4744" containerName="registry-server" containerID="cri-o://65c3825f8815a85583abf60e517d3c55980f573b09a6343b9b5c0e045531c8e0" gracePeriod=2 Dec 11 13:39:38 crc kubenswrapper[4898]: I1211 13:39:38.587703 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4"] Dec 11 13:39:38 crc kubenswrapper[4898]: I1211 13:39:38.790838 4898 scope.go:117] "RemoveContainer" containerID="b25bab6209ad2fceaf30d8c7226b13536eb0afd152d8951b38bf8d6cf63418d8" Dec 11 13:39:38 crc kubenswrapper[4898]: I1211 13:39:38.822221 4898 scope.go:117] "RemoveContainer" containerID="dd3e7074c59bad247315decb5917b222623bd6e03dd0c4043626d5da8c535707" Dec 11 13:39:38 crc kubenswrapper[4898]: I1211 13:39:38.903853 4898 scope.go:117] "RemoveContainer" containerID="f697e132855971325a13977d240628b9d340ff5b8fbc1d55606c7d0c7520615b" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.304284 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.369394 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm7m7\" (UniqueName: \"kubernetes.io/projected/959cd436-0db4-416b-a554-a7ee662e4744-kube-api-access-sm7m7\") pod \"959cd436-0db4-416b-a554-a7ee662e4744\" (UID: \"959cd436-0db4-416b-a554-a7ee662e4744\") " Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.369615 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/959cd436-0db4-416b-a554-a7ee662e4744-catalog-content\") pod \"959cd436-0db4-416b-a554-a7ee662e4744\" (UID: \"959cd436-0db4-416b-a554-a7ee662e4744\") " Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.369838 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/959cd436-0db4-416b-a554-a7ee662e4744-utilities\") pod \"959cd436-0db4-416b-a554-a7ee662e4744\" (UID: \"959cd436-0db4-416b-a554-a7ee662e4744\") " Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.370942 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/959cd436-0db4-416b-a554-a7ee662e4744-utilities" (OuterVolumeSpecName: "utilities") pod "959cd436-0db4-416b-a554-a7ee662e4744" (UID: "959cd436-0db4-416b-a554-a7ee662e4744"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.376086 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/959cd436-0db4-416b-a554-a7ee662e4744-kube-api-access-sm7m7" (OuterVolumeSpecName: "kube-api-access-sm7m7") pod "959cd436-0db4-416b-a554-a7ee662e4744" (UID: "959cd436-0db4-416b-a554-a7ee662e4744"). InnerVolumeSpecName "kube-api-access-sm7m7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.407263 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/959cd436-0db4-416b-a554-a7ee662e4744-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "959cd436-0db4-416b-a554-a7ee662e4744" (UID: "959cd436-0db4-416b-a554-a7ee662e4744"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.473491 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm7m7\" (UniqueName: \"kubernetes.io/projected/959cd436-0db4-416b-a554-a7ee662e4744-kube-api-access-sm7m7\") on node \"crc\" DevicePath \"\"" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.473528 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/959cd436-0db4-416b-a554-a7ee662e4744-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.473544 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/959cd436-0db4-416b-a554-a7ee662e4744-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.574502 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" event={"ID":"0c0edae9-777b-4b14-9014-8e7dddcc6319","Type":"ContainerStarted","Data":"05f45cd92b12811fdc7f0a6d064aad5cc73611a0384ca57aa67d4d51f60c60f4"} Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.574550 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" event={"ID":"0c0edae9-777b-4b14-9014-8e7dddcc6319","Type":"ContainerStarted","Data":"6dc15f7f60ae70c721dea623069035cf608baeb23514755751dcc93cbc063c77"} Dec 11 
13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.579528 4898 generic.go:334] "Generic (PLEG): container finished" podID="959cd436-0db4-416b-a554-a7ee662e4744" containerID="65c3825f8815a85583abf60e517d3c55980f573b09a6343b9b5c0e045531c8e0" exitCode=0 Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.579591 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbn4d" event={"ID":"959cd436-0db4-416b-a554-a7ee662e4744","Type":"ContainerDied","Data":"65c3825f8815a85583abf60e517d3c55980f573b09a6343b9b5c0e045531c8e0"} Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.579626 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbn4d" event={"ID":"959cd436-0db4-416b-a554-a7ee662e4744","Type":"ContainerDied","Data":"e6901a5c0794b20c3999e89aa3e5314822d86b50f84469590fd73af98e9ebef8"} Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.579646 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbn4d" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.579664 4898 scope.go:117] "RemoveContainer" containerID="65c3825f8815a85583abf60e517d3c55980f573b09a6343b9b5c0e045531c8e0" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.594518 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" podStartSLOduration=2.14745862 podStartE2EDuration="2.594426279s" podCreationTimestamp="2025-12-11 13:39:37 +0000 UTC" firstStartedPulling="2025-12-11 13:39:38.589826878 +0000 UTC m=+2136.162153315" lastFinishedPulling="2025-12-11 13:39:39.036794527 +0000 UTC m=+2136.609120974" observedRunningTime="2025-12-11 13:39:39.592092526 +0000 UTC m=+2137.164418983" watchObservedRunningTime="2025-12-11 13:39:39.594426279 +0000 UTC m=+2137.166752786" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.622398 4898 scope.go:117] "RemoveContainer" containerID="4a68ca091fb28e94da6263f1b8c5bb4afbae49588318deeb7a541f13e0bdfc39" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.644939 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbn4d"] Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.652349 4898 scope.go:117] "RemoveContainer" containerID="d188c2946561dc0c4424017120d8d9639c39278acded6651b359f72ea0c6e9e0" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.654589 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbn4d"] Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.680775 4898 scope.go:117] "RemoveContainer" containerID="65c3825f8815a85583abf60e517d3c55980f573b09a6343b9b5c0e045531c8e0" Dec 11 13:39:39 crc kubenswrapper[4898]: E1211 13:39:39.681171 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"65c3825f8815a85583abf60e517d3c55980f573b09a6343b9b5c0e045531c8e0\": container with ID starting with 65c3825f8815a85583abf60e517d3c55980f573b09a6343b9b5c0e045531c8e0 not found: ID does not exist" containerID="65c3825f8815a85583abf60e517d3c55980f573b09a6343b9b5c0e045531c8e0" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.681200 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c3825f8815a85583abf60e517d3c55980f573b09a6343b9b5c0e045531c8e0"} err="failed to get container status \"65c3825f8815a85583abf60e517d3c55980f573b09a6343b9b5c0e045531c8e0\": rpc error: code = NotFound desc = could not find container \"65c3825f8815a85583abf60e517d3c55980f573b09a6343b9b5c0e045531c8e0\": container with ID starting with 65c3825f8815a85583abf60e517d3c55980f573b09a6343b9b5c0e045531c8e0 not found: ID does not exist" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.681223 4898 scope.go:117] "RemoveContainer" containerID="4a68ca091fb28e94da6263f1b8c5bb4afbae49588318deeb7a541f13e0bdfc39" Dec 11 13:39:39 crc kubenswrapper[4898]: E1211 13:39:39.681474 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a68ca091fb28e94da6263f1b8c5bb4afbae49588318deeb7a541f13e0bdfc39\": container with ID starting with 4a68ca091fb28e94da6263f1b8c5bb4afbae49588318deeb7a541f13e0bdfc39 not found: ID does not exist" containerID="4a68ca091fb28e94da6263f1b8c5bb4afbae49588318deeb7a541f13e0bdfc39" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.681509 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a68ca091fb28e94da6263f1b8c5bb4afbae49588318deeb7a541f13e0bdfc39"} err="failed to get container status \"4a68ca091fb28e94da6263f1b8c5bb4afbae49588318deeb7a541f13e0bdfc39\": rpc error: code = NotFound desc = could not find container \"4a68ca091fb28e94da6263f1b8c5bb4afbae49588318deeb7a541f13e0bdfc39\": container with ID 
starting with 4a68ca091fb28e94da6263f1b8c5bb4afbae49588318deeb7a541f13e0bdfc39 not found: ID does not exist" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.681534 4898 scope.go:117] "RemoveContainer" containerID="d188c2946561dc0c4424017120d8d9639c39278acded6651b359f72ea0c6e9e0" Dec 11 13:39:39 crc kubenswrapper[4898]: E1211 13:39:39.681939 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d188c2946561dc0c4424017120d8d9639c39278acded6651b359f72ea0c6e9e0\": container with ID starting with d188c2946561dc0c4424017120d8d9639c39278acded6651b359f72ea0c6e9e0 not found: ID does not exist" containerID="d188c2946561dc0c4424017120d8d9639c39278acded6651b359f72ea0c6e9e0" Dec 11 13:39:39 crc kubenswrapper[4898]: I1211 13:39:39.681979 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d188c2946561dc0c4424017120d8d9639c39278acded6651b359f72ea0c6e9e0"} err="failed to get container status \"d188c2946561dc0c4424017120d8d9639c39278acded6651b359f72ea0c6e9e0\": rpc error: code = NotFound desc = could not find container \"d188c2946561dc0c4424017120d8d9639c39278acded6651b359f72ea0c6e9e0\": container with ID starting with d188c2946561dc0c4424017120d8d9639c39278acded6651b359f72ea0c6e9e0 not found: ID does not exist" Dec 11 13:39:40 crc kubenswrapper[4898]: I1211 13:39:40.795065 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="959cd436-0db4-416b-a554-a7ee662e4744" path="/var/lib/kubelet/pods/959cd436-0db4-416b-a554-a7ee662e4744/volumes" Dec 11 13:39:42 crc kubenswrapper[4898]: I1211 13:39:42.132892 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:42 crc kubenswrapper[4898]: I1211 13:39:42.205031 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x548r"] Dec 11 13:39:42 crc 
kubenswrapper[4898]: I1211 13:39:42.632663 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x548r" podUID="b7a63ef4-d54b-4e53-abdf-a8f2c720f51c" containerName="registry-server" containerID="cri-o://b98617b84f65a5135c7da3b4d2400157dc98df63be35591143c07c8dd414d7f7" gracePeriod=2 Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.225639 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.274104 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgqsz\" (UniqueName: \"kubernetes.io/projected/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-kube-api-access-xgqsz\") pod \"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c\" (UID: \"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c\") " Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.274416 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-utilities\") pod \"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c\" (UID: \"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c\") " Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.274660 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-catalog-content\") pod \"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c\" (UID: \"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c\") " Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.275140 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-utilities" (OuterVolumeSpecName: "utilities") pod "b7a63ef4-d54b-4e53-abdf-a8f2c720f51c" (UID: "b7a63ef4-d54b-4e53-abdf-a8f2c720f51c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.284265 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-kube-api-access-xgqsz" (OuterVolumeSpecName: "kube-api-access-xgqsz") pod "b7a63ef4-d54b-4e53-abdf-a8f2c720f51c" (UID: "b7a63ef4-d54b-4e53-abdf-a8f2c720f51c"). InnerVolumeSpecName "kube-api-access-xgqsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.331556 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7a63ef4-d54b-4e53-abdf-a8f2c720f51c" (UID: "b7a63ef4-d54b-4e53-abdf-a8f2c720f51c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.377146 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.377180 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgqsz\" (UniqueName: \"kubernetes.io/projected/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-kube-api-access-xgqsz\") on node \"crc\" DevicePath \"\"" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.377191 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.646516 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x548r" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.646592 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x548r" event={"ID":"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c","Type":"ContainerDied","Data":"b98617b84f65a5135c7da3b4d2400157dc98df63be35591143c07c8dd414d7f7"} Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.646659 4898 scope.go:117] "RemoveContainer" containerID="b98617b84f65a5135c7da3b4d2400157dc98df63be35591143c07c8dd414d7f7" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.646438 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7a63ef4-d54b-4e53-abdf-a8f2c720f51c" containerID="b98617b84f65a5135c7da3b4d2400157dc98df63be35591143c07c8dd414d7f7" exitCode=0 Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.646838 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x548r" event={"ID":"b7a63ef4-d54b-4e53-abdf-a8f2c720f51c","Type":"ContainerDied","Data":"1f073faa09f7f430f65f052e6d61fd36dc314bd01011ad07f031f38e44379e2c"} Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.679289 4898 scope.go:117] "RemoveContainer" containerID="570c593eea35b0e6ea2515a7a12f266d7f1c004bb752f5ed71a1e73978513863" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.714102 4898 scope.go:117] "RemoveContainer" containerID="a370e9f42ec63af0149f76d9c9640dce31f5c76f823136180dde9be44ca2accd" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.719762 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x548r"] Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.730038 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x548r"] Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.765249 4898 scope.go:117] "RemoveContainer" 
containerID="b98617b84f65a5135c7da3b4d2400157dc98df63be35591143c07c8dd414d7f7" Dec 11 13:39:43 crc kubenswrapper[4898]: E1211 13:39:43.765754 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b98617b84f65a5135c7da3b4d2400157dc98df63be35591143c07c8dd414d7f7\": container with ID starting with b98617b84f65a5135c7da3b4d2400157dc98df63be35591143c07c8dd414d7f7 not found: ID does not exist" containerID="b98617b84f65a5135c7da3b4d2400157dc98df63be35591143c07c8dd414d7f7" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.765908 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b98617b84f65a5135c7da3b4d2400157dc98df63be35591143c07c8dd414d7f7"} err="failed to get container status \"b98617b84f65a5135c7da3b4d2400157dc98df63be35591143c07c8dd414d7f7\": rpc error: code = NotFound desc = could not find container \"b98617b84f65a5135c7da3b4d2400157dc98df63be35591143c07c8dd414d7f7\": container with ID starting with b98617b84f65a5135c7da3b4d2400157dc98df63be35591143c07c8dd414d7f7 not found: ID does not exist" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.766098 4898 scope.go:117] "RemoveContainer" containerID="570c593eea35b0e6ea2515a7a12f266d7f1c004bb752f5ed71a1e73978513863" Dec 11 13:39:43 crc kubenswrapper[4898]: E1211 13:39:43.766503 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"570c593eea35b0e6ea2515a7a12f266d7f1c004bb752f5ed71a1e73978513863\": container with ID starting with 570c593eea35b0e6ea2515a7a12f266d7f1c004bb752f5ed71a1e73978513863 not found: ID does not exist" containerID="570c593eea35b0e6ea2515a7a12f266d7f1c004bb752f5ed71a1e73978513863" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.766549 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"570c593eea35b0e6ea2515a7a12f266d7f1c004bb752f5ed71a1e73978513863"} err="failed to get container status \"570c593eea35b0e6ea2515a7a12f266d7f1c004bb752f5ed71a1e73978513863\": rpc error: code = NotFound desc = could not find container \"570c593eea35b0e6ea2515a7a12f266d7f1c004bb752f5ed71a1e73978513863\": container with ID starting with 570c593eea35b0e6ea2515a7a12f266d7f1c004bb752f5ed71a1e73978513863 not found: ID does not exist" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.766565 4898 scope.go:117] "RemoveContainer" containerID="a370e9f42ec63af0149f76d9c9640dce31f5c76f823136180dde9be44ca2accd" Dec 11 13:39:43 crc kubenswrapper[4898]: E1211 13:39:43.766837 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a370e9f42ec63af0149f76d9c9640dce31f5c76f823136180dde9be44ca2accd\": container with ID starting with a370e9f42ec63af0149f76d9c9640dce31f5c76f823136180dde9be44ca2accd not found: ID does not exist" containerID="a370e9f42ec63af0149f76d9c9640dce31f5c76f823136180dde9be44ca2accd" Dec 11 13:39:43 crc kubenswrapper[4898]: I1211 13:39:43.766930 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a370e9f42ec63af0149f76d9c9640dce31f5c76f823136180dde9be44ca2accd"} err="failed to get container status \"a370e9f42ec63af0149f76d9c9640dce31f5c76f823136180dde9be44ca2accd\": rpc error: code = NotFound desc = could not find container \"a370e9f42ec63af0149f76d9c9640dce31f5c76f823136180dde9be44ca2accd\": container with ID starting with a370e9f42ec63af0149f76d9c9640dce31f5c76f823136180dde9be44ca2accd not found: ID does not exist" Dec 11 13:39:44 crc kubenswrapper[4898]: I1211 13:39:44.799086 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7a63ef4-d54b-4e53-abdf-a8f2c720f51c" path="/var/lib/kubelet/pods/b7a63ef4-d54b-4e53-abdf-a8f2c720f51c/volumes" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 
13:39:54.558384 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sq2h8"] Dec 11 13:39:54 crc kubenswrapper[4898]: E1211 13:39:54.559554 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a63ef4-d54b-4e53-abdf-a8f2c720f51c" containerName="extract-utilities" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.559573 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a63ef4-d54b-4e53-abdf-a8f2c720f51c" containerName="extract-utilities" Dec 11 13:39:54 crc kubenswrapper[4898]: E1211 13:39:54.559594 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959cd436-0db4-416b-a554-a7ee662e4744" containerName="extract-content" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.559602 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="959cd436-0db4-416b-a554-a7ee662e4744" containerName="extract-content" Dec 11 13:39:54 crc kubenswrapper[4898]: E1211 13:39:54.559618 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a63ef4-d54b-4e53-abdf-a8f2c720f51c" containerName="extract-content" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.559627 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a63ef4-d54b-4e53-abdf-a8f2c720f51c" containerName="extract-content" Dec 11 13:39:54 crc kubenswrapper[4898]: E1211 13:39:54.559644 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a63ef4-d54b-4e53-abdf-a8f2c720f51c" containerName="registry-server" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.559652 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a63ef4-d54b-4e53-abdf-a8f2c720f51c" containerName="registry-server" Dec 11 13:39:54 crc kubenswrapper[4898]: E1211 13:39:54.559660 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959cd436-0db4-416b-a554-a7ee662e4744" containerName="extract-utilities" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.559667 4898 
state_mem.go:107] "Deleted CPUSet assignment" podUID="959cd436-0db4-416b-a554-a7ee662e4744" containerName="extract-utilities" Dec 11 13:39:54 crc kubenswrapper[4898]: E1211 13:39:54.559688 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959cd436-0db4-416b-a554-a7ee662e4744" containerName="registry-server" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.559696 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="959cd436-0db4-416b-a554-a7ee662e4744" containerName="registry-server" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.559989 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="959cd436-0db4-416b-a554-a7ee662e4744" containerName="registry-server" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.560028 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a63ef4-d54b-4e53-abdf-a8f2c720f51c" containerName="registry-server" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.562502 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.572203 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f9497d2-2241-46c7-84cf-e78f5dc6b608-catalog-content\") pod \"redhat-operators-sq2h8\" (UID: \"8f9497d2-2241-46c7-84cf-e78f5dc6b608\") " pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.572288 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f9497d2-2241-46c7-84cf-e78f5dc6b608-utilities\") pod \"redhat-operators-sq2h8\" (UID: \"8f9497d2-2241-46c7-84cf-e78f5dc6b608\") " pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.572494 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx5hs\" (UniqueName: \"kubernetes.io/projected/8f9497d2-2241-46c7-84cf-e78f5dc6b608-kube-api-access-jx5hs\") pod \"redhat-operators-sq2h8\" (UID: \"8f9497d2-2241-46c7-84cf-e78f5dc6b608\") " pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.594522 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sq2h8"] Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.675061 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f9497d2-2241-46c7-84cf-e78f5dc6b608-utilities\") pod \"redhat-operators-sq2h8\" (UID: \"8f9497d2-2241-46c7-84cf-e78f5dc6b608\") " pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.675536 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jx5hs\" (UniqueName: \"kubernetes.io/projected/8f9497d2-2241-46c7-84cf-e78f5dc6b608-kube-api-access-jx5hs\") pod \"redhat-operators-sq2h8\" (UID: \"8f9497d2-2241-46c7-84cf-e78f5dc6b608\") " pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.675693 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f9497d2-2241-46c7-84cf-e78f5dc6b608-catalog-content\") pod \"redhat-operators-sq2h8\" (UID: \"8f9497d2-2241-46c7-84cf-e78f5dc6b608\") " pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.675701 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f9497d2-2241-46c7-84cf-e78f5dc6b608-utilities\") pod \"redhat-operators-sq2h8\" (UID: \"8f9497d2-2241-46c7-84cf-e78f5dc6b608\") " pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.676191 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f9497d2-2241-46c7-84cf-e78f5dc6b608-catalog-content\") pod \"redhat-operators-sq2h8\" (UID: \"8f9497d2-2241-46c7-84cf-e78f5dc6b608\") " pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.698690 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx5hs\" (UniqueName: \"kubernetes.io/projected/8f9497d2-2241-46c7-84cf-e78f5dc6b608-kube-api-access-jx5hs\") pod \"redhat-operators-sq2h8\" (UID: \"8f9497d2-2241-46c7-84cf-e78f5dc6b608\") " pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:39:54 crc kubenswrapper[4898]: I1211 13:39:54.890578 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:39:55 crc kubenswrapper[4898]: I1211 13:39:55.428097 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sq2h8"] Dec 11 13:39:55 crc kubenswrapper[4898]: I1211 13:39:55.794699 4898 generic.go:334] "Generic (PLEG): container finished" podID="8f9497d2-2241-46c7-84cf-e78f5dc6b608" containerID="660a8dc02d5ebb50f7cb9de6e75a00030f453b12bb1081005870aeab5a787587" exitCode=0 Dec 11 13:39:55 crc kubenswrapper[4898]: I1211 13:39:55.794763 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sq2h8" event={"ID":"8f9497d2-2241-46c7-84cf-e78f5dc6b608","Type":"ContainerDied","Data":"660a8dc02d5ebb50f7cb9de6e75a00030f453b12bb1081005870aeab5a787587"} Dec 11 13:39:55 crc kubenswrapper[4898]: I1211 13:39:55.794974 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sq2h8" event={"ID":"8f9497d2-2241-46c7-84cf-e78f5dc6b608","Type":"ContainerStarted","Data":"393b87c4a41f7e1da3b0ad4c417be1eb98524d7159d23eb74acc83720db19faa"} Dec 11 13:39:56 crc kubenswrapper[4898]: I1211 13:39:56.809117 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sq2h8" event={"ID":"8f9497d2-2241-46c7-84cf-e78f5dc6b608","Type":"ContainerStarted","Data":"e04b23da52e8f80f4f8ef354072ec6fcb2bfca04914cbf69fdfc9dfdc8db451b"} Dec 11 13:40:00 crc kubenswrapper[4898]: I1211 13:40:00.860886 4898 generic.go:334] "Generic (PLEG): container finished" podID="8f9497d2-2241-46c7-84cf-e78f5dc6b608" containerID="e04b23da52e8f80f4f8ef354072ec6fcb2bfca04914cbf69fdfc9dfdc8db451b" exitCode=0 Dec 11 13:40:00 crc kubenswrapper[4898]: I1211 13:40:00.861055 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sq2h8" 
event={"ID":"8f9497d2-2241-46c7-84cf-e78f5dc6b608","Type":"ContainerDied","Data":"e04b23da52e8f80f4f8ef354072ec6fcb2bfca04914cbf69fdfc9dfdc8db451b"} Dec 11 13:40:01 crc kubenswrapper[4898]: I1211 13:40:01.876565 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sq2h8" event={"ID":"8f9497d2-2241-46c7-84cf-e78f5dc6b608","Type":"ContainerStarted","Data":"4c7b28588ff2f66bdc6f224e4d1e2564dc49e793635112cf0b736832a0a95282"} Dec 11 13:40:04 crc kubenswrapper[4898]: I1211 13:40:04.891128 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:40:04 crc kubenswrapper[4898]: I1211 13:40:04.891666 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:40:04 crc kubenswrapper[4898]: I1211 13:40:04.995915 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:40:04 crc kubenswrapper[4898]: I1211 13:40:04.996298 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:40:05 crc kubenswrapper[4898]: I1211 13:40:05.945228 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sq2h8" podUID="8f9497d2-2241-46c7-84cf-e78f5dc6b608" containerName="registry-server" probeResult="failure" output=< Dec 11 13:40:05 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 13:40:05 crc 
kubenswrapper[4898]: > Dec 11 13:40:14 crc kubenswrapper[4898]: I1211 13:40:14.974076 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:40:14 crc kubenswrapper[4898]: I1211 13:40:14.996998 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sq2h8" podStartSLOduration=15.475136962 podStartE2EDuration="20.996976434s" podCreationTimestamp="2025-12-11 13:39:54 +0000 UTC" firstStartedPulling="2025-12-11 13:39:55.796511207 +0000 UTC m=+2153.368837634" lastFinishedPulling="2025-12-11 13:40:01.318350669 +0000 UTC m=+2158.890677106" observedRunningTime="2025-12-11 13:40:01.902861793 +0000 UTC m=+2159.475188240" watchObservedRunningTime="2025-12-11 13:40:14.996976434 +0000 UTC m=+2172.569302881" Dec 11 13:40:15 crc kubenswrapper[4898]: I1211 13:40:15.036630 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:40:15 crc kubenswrapper[4898]: I1211 13:40:15.217136 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sq2h8"] Dec 11 13:40:16 crc kubenswrapper[4898]: I1211 13:40:16.052722 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sq2h8" podUID="8f9497d2-2241-46c7-84cf-e78f5dc6b608" containerName="registry-server" containerID="cri-o://4c7b28588ff2f66bdc6f224e4d1e2564dc49e793635112cf0b736832a0a95282" gracePeriod=2 Dec 11 13:40:16 crc kubenswrapper[4898]: I1211 13:40:16.637598 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:40:16 crc kubenswrapper[4898]: I1211 13:40:16.770121 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f9497d2-2241-46c7-84cf-e78f5dc6b608-catalog-content\") pod \"8f9497d2-2241-46c7-84cf-e78f5dc6b608\" (UID: \"8f9497d2-2241-46c7-84cf-e78f5dc6b608\") " Dec 11 13:40:16 crc kubenswrapper[4898]: I1211 13:40:16.770316 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f9497d2-2241-46c7-84cf-e78f5dc6b608-utilities\") pod \"8f9497d2-2241-46c7-84cf-e78f5dc6b608\" (UID: \"8f9497d2-2241-46c7-84cf-e78f5dc6b608\") " Dec 11 13:40:16 crc kubenswrapper[4898]: I1211 13:40:16.770640 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx5hs\" (UniqueName: \"kubernetes.io/projected/8f9497d2-2241-46c7-84cf-e78f5dc6b608-kube-api-access-jx5hs\") pod \"8f9497d2-2241-46c7-84cf-e78f5dc6b608\" (UID: \"8f9497d2-2241-46c7-84cf-e78f5dc6b608\") " Dec 11 13:40:16 crc kubenswrapper[4898]: I1211 13:40:16.771092 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f9497d2-2241-46c7-84cf-e78f5dc6b608-utilities" (OuterVolumeSpecName: "utilities") pod "8f9497d2-2241-46c7-84cf-e78f5dc6b608" (UID: "8f9497d2-2241-46c7-84cf-e78f5dc6b608"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:40:16 crc kubenswrapper[4898]: I1211 13:40:16.771848 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f9497d2-2241-46c7-84cf-e78f5dc6b608-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:40:16 crc kubenswrapper[4898]: I1211 13:40:16.806477 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f9497d2-2241-46c7-84cf-e78f5dc6b608-kube-api-access-jx5hs" (OuterVolumeSpecName: "kube-api-access-jx5hs") pod "8f9497d2-2241-46c7-84cf-e78f5dc6b608" (UID: "8f9497d2-2241-46c7-84cf-e78f5dc6b608"). InnerVolumeSpecName "kube-api-access-jx5hs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:40:16 crc kubenswrapper[4898]: I1211 13:40:16.877082 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx5hs\" (UniqueName: \"kubernetes.io/projected/8f9497d2-2241-46c7-84cf-e78f5dc6b608-kube-api-access-jx5hs\") on node \"crc\" DevicePath \"\"" Dec 11 13:40:16 crc kubenswrapper[4898]: I1211 13:40:16.906300 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f9497d2-2241-46c7-84cf-e78f5dc6b608-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f9497d2-2241-46c7-84cf-e78f5dc6b608" (UID: "8f9497d2-2241-46c7-84cf-e78f5dc6b608"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:40:16 crc kubenswrapper[4898]: I1211 13:40:16.979366 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f9497d2-2241-46c7-84cf-e78f5dc6b608-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:40:17 crc kubenswrapper[4898]: I1211 13:40:17.069178 4898 generic.go:334] "Generic (PLEG): container finished" podID="8f9497d2-2241-46c7-84cf-e78f5dc6b608" containerID="4c7b28588ff2f66bdc6f224e4d1e2564dc49e793635112cf0b736832a0a95282" exitCode=0 Dec 11 13:40:17 crc kubenswrapper[4898]: I1211 13:40:17.069226 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sq2h8" event={"ID":"8f9497d2-2241-46c7-84cf-e78f5dc6b608","Type":"ContainerDied","Data":"4c7b28588ff2f66bdc6f224e4d1e2564dc49e793635112cf0b736832a0a95282"} Dec 11 13:40:17 crc kubenswrapper[4898]: I1211 13:40:17.069266 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sq2h8" event={"ID":"8f9497d2-2241-46c7-84cf-e78f5dc6b608","Type":"ContainerDied","Data":"393b87c4a41f7e1da3b0ad4c417be1eb98524d7159d23eb74acc83720db19faa"} Dec 11 13:40:17 crc kubenswrapper[4898]: I1211 13:40:17.069295 4898 scope.go:117] "RemoveContainer" containerID="4c7b28588ff2f66bdc6f224e4d1e2564dc49e793635112cf0b736832a0a95282" Dec 11 13:40:17 crc kubenswrapper[4898]: I1211 13:40:17.069334 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sq2h8" Dec 11 13:40:17 crc kubenswrapper[4898]: I1211 13:40:17.092589 4898 scope.go:117] "RemoveContainer" containerID="e04b23da52e8f80f4f8ef354072ec6fcb2bfca04914cbf69fdfc9dfdc8db451b" Dec 11 13:40:17 crc kubenswrapper[4898]: I1211 13:40:17.117587 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sq2h8"] Dec 11 13:40:17 crc kubenswrapper[4898]: I1211 13:40:17.132390 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sq2h8"] Dec 11 13:40:17 crc kubenswrapper[4898]: I1211 13:40:17.139539 4898 scope.go:117] "RemoveContainer" containerID="660a8dc02d5ebb50f7cb9de6e75a00030f453b12bb1081005870aeab5a787587" Dec 11 13:40:17 crc kubenswrapper[4898]: I1211 13:40:17.174742 4898 scope.go:117] "RemoveContainer" containerID="4c7b28588ff2f66bdc6f224e4d1e2564dc49e793635112cf0b736832a0a95282" Dec 11 13:40:17 crc kubenswrapper[4898]: E1211 13:40:17.175248 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c7b28588ff2f66bdc6f224e4d1e2564dc49e793635112cf0b736832a0a95282\": container with ID starting with 4c7b28588ff2f66bdc6f224e4d1e2564dc49e793635112cf0b736832a0a95282 not found: ID does not exist" containerID="4c7b28588ff2f66bdc6f224e4d1e2564dc49e793635112cf0b736832a0a95282" Dec 11 13:40:17 crc kubenswrapper[4898]: I1211 13:40:17.175288 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7b28588ff2f66bdc6f224e4d1e2564dc49e793635112cf0b736832a0a95282"} err="failed to get container status \"4c7b28588ff2f66bdc6f224e4d1e2564dc49e793635112cf0b736832a0a95282\": rpc error: code = NotFound desc = could not find container \"4c7b28588ff2f66bdc6f224e4d1e2564dc49e793635112cf0b736832a0a95282\": container with ID starting with 4c7b28588ff2f66bdc6f224e4d1e2564dc49e793635112cf0b736832a0a95282 not found: ID does 
not exist" Dec 11 13:40:17 crc kubenswrapper[4898]: I1211 13:40:17.175310 4898 scope.go:117] "RemoveContainer" containerID="e04b23da52e8f80f4f8ef354072ec6fcb2bfca04914cbf69fdfc9dfdc8db451b" Dec 11 13:40:17 crc kubenswrapper[4898]: E1211 13:40:17.175636 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e04b23da52e8f80f4f8ef354072ec6fcb2bfca04914cbf69fdfc9dfdc8db451b\": container with ID starting with e04b23da52e8f80f4f8ef354072ec6fcb2bfca04914cbf69fdfc9dfdc8db451b not found: ID does not exist" containerID="e04b23da52e8f80f4f8ef354072ec6fcb2bfca04914cbf69fdfc9dfdc8db451b" Dec 11 13:40:17 crc kubenswrapper[4898]: I1211 13:40:17.175661 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04b23da52e8f80f4f8ef354072ec6fcb2bfca04914cbf69fdfc9dfdc8db451b"} err="failed to get container status \"e04b23da52e8f80f4f8ef354072ec6fcb2bfca04914cbf69fdfc9dfdc8db451b\": rpc error: code = NotFound desc = could not find container \"e04b23da52e8f80f4f8ef354072ec6fcb2bfca04914cbf69fdfc9dfdc8db451b\": container with ID starting with e04b23da52e8f80f4f8ef354072ec6fcb2bfca04914cbf69fdfc9dfdc8db451b not found: ID does not exist" Dec 11 13:40:17 crc kubenswrapper[4898]: I1211 13:40:17.175675 4898 scope.go:117] "RemoveContainer" containerID="660a8dc02d5ebb50f7cb9de6e75a00030f453b12bb1081005870aeab5a787587" Dec 11 13:40:17 crc kubenswrapper[4898]: E1211 13:40:17.175917 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"660a8dc02d5ebb50f7cb9de6e75a00030f453b12bb1081005870aeab5a787587\": container with ID starting with 660a8dc02d5ebb50f7cb9de6e75a00030f453b12bb1081005870aeab5a787587 not found: ID does not exist" containerID="660a8dc02d5ebb50f7cb9de6e75a00030f453b12bb1081005870aeab5a787587" Dec 11 13:40:17 crc kubenswrapper[4898]: I1211 13:40:17.175941 4898 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"660a8dc02d5ebb50f7cb9de6e75a00030f453b12bb1081005870aeab5a787587"} err="failed to get container status \"660a8dc02d5ebb50f7cb9de6e75a00030f453b12bb1081005870aeab5a787587\": rpc error: code = NotFound desc = could not find container \"660a8dc02d5ebb50f7cb9de6e75a00030f453b12bb1081005870aeab5a787587\": container with ID starting with 660a8dc02d5ebb50f7cb9de6e75a00030f453b12bb1081005870aeab5a787587 not found: ID does not exist" Dec 11 13:40:18 crc kubenswrapper[4898]: I1211 13:40:18.791887 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f9497d2-2241-46c7-84cf-e78f5dc6b608" path="/var/lib/kubelet/pods/8f9497d2-2241-46c7-84cf-e78f5dc6b608/volumes" Dec 11 13:40:22 crc kubenswrapper[4898]: I1211 13:40:22.146896 4898 generic.go:334] "Generic (PLEG): container finished" podID="0c0edae9-777b-4b14-9014-8e7dddcc6319" containerID="05f45cd92b12811fdc7f0a6d064aad5cc73611a0384ca57aa67d4d51f60c60f4" exitCode=0 Dec 11 13:40:22 crc kubenswrapper[4898]: I1211 13:40:22.146987 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" event={"ID":"0c0edae9-777b-4b14-9014-8e7dddcc6319","Type":"ContainerDied","Data":"05f45cd92b12811fdc7f0a6d064aad5cc73611a0384ca57aa67d4d51f60c60f4"} Dec 11 13:40:23 crc kubenswrapper[4898]: I1211 13:40:23.721129 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" Dec 11 13:40:23 crc kubenswrapper[4898]: I1211 13:40:23.870393 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c0edae9-777b-4b14-9014-8e7dddcc6319-inventory\") pod \"0c0edae9-777b-4b14-9014-8e7dddcc6319\" (UID: \"0c0edae9-777b-4b14-9014-8e7dddcc6319\") " Dec 11 13:40:23 crc kubenswrapper[4898]: I1211 13:40:23.870486 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c0edae9-777b-4b14-9014-8e7dddcc6319-ssh-key\") pod \"0c0edae9-777b-4b14-9014-8e7dddcc6319\" (UID: \"0c0edae9-777b-4b14-9014-8e7dddcc6319\") " Dec 11 13:40:23 crc kubenswrapper[4898]: I1211 13:40:23.870730 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sh7h\" (UniqueName: \"kubernetes.io/projected/0c0edae9-777b-4b14-9014-8e7dddcc6319-kube-api-access-5sh7h\") pod \"0c0edae9-777b-4b14-9014-8e7dddcc6319\" (UID: \"0c0edae9-777b-4b14-9014-8e7dddcc6319\") " Dec 11 13:40:23 crc kubenswrapper[4898]: I1211 13:40:23.876757 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c0edae9-777b-4b14-9014-8e7dddcc6319-kube-api-access-5sh7h" (OuterVolumeSpecName: "kube-api-access-5sh7h") pod "0c0edae9-777b-4b14-9014-8e7dddcc6319" (UID: "0c0edae9-777b-4b14-9014-8e7dddcc6319"). InnerVolumeSpecName "kube-api-access-5sh7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:40:23 crc kubenswrapper[4898]: I1211 13:40:23.906277 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c0edae9-777b-4b14-9014-8e7dddcc6319-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0c0edae9-777b-4b14-9014-8e7dddcc6319" (UID: "0c0edae9-777b-4b14-9014-8e7dddcc6319"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:40:23 crc kubenswrapper[4898]: I1211 13:40:23.917674 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c0edae9-777b-4b14-9014-8e7dddcc6319-inventory" (OuterVolumeSpecName: "inventory") pod "0c0edae9-777b-4b14-9014-8e7dddcc6319" (UID: "0c0edae9-777b-4b14-9014-8e7dddcc6319"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:40:23 crc kubenswrapper[4898]: I1211 13:40:23.974640 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c0edae9-777b-4b14-9014-8e7dddcc6319-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:40:23 crc kubenswrapper[4898]: I1211 13:40:23.974888 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c0edae9-777b-4b14-9014-8e7dddcc6319-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:40:23 crc kubenswrapper[4898]: I1211 13:40:23.974902 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sh7h\" (UniqueName: \"kubernetes.io/projected/0c0edae9-777b-4b14-9014-8e7dddcc6319-kube-api-access-5sh7h\") on node \"crc\" DevicePath \"\"" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.170586 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" event={"ID":"0c0edae9-777b-4b14-9014-8e7dddcc6319","Type":"ContainerDied","Data":"6dc15f7f60ae70c721dea623069035cf608baeb23514755751dcc93cbc063c77"} Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.170618 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lv4d4" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.170629 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dc15f7f60ae70c721dea623069035cf608baeb23514755751dcc93cbc063c77" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.292076 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2"] Dec 11 13:40:24 crc kubenswrapper[4898]: E1211 13:40:24.293140 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0edae9-777b-4b14-9014-8e7dddcc6319" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.293160 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0edae9-777b-4b14-9014-8e7dddcc6319" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 11 13:40:24 crc kubenswrapper[4898]: E1211 13:40:24.293189 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f9497d2-2241-46c7-84cf-e78f5dc6b608" containerName="extract-content" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.293197 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f9497d2-2241-46c7-84cf-e78f5dc6b608" containerName="extract-content" Dec 11 13:40:24 crc kubenswrapper[4898]: E1211 13:40:24.293215 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f9497d2-2241-46c7-84cf-e78f5dc6b608" containerName="extract-utilities" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.293222 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f9497d2-2241-46c7-84cf-e78f5dc6b608" containerName="extract-utilities" Dec 11 13:40:24 crc kubenswrapper[4898]: E1211 13:40:24.293242 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f9497d2-2241-46c7-84cf-e78f5dc6b608" containerName="registry-server" Dec 11 13:40:24 crc 
kubenswrapper[4898]: I1211 13:40:24.293248 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f9497d2-2241-46c7-84cf-e78f5dc6b608" containerName="registry-server" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.293674 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c0edae9-777b-4b14-9014-8e7dddcc6319" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.293717 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f9497d2-2241-46c7-84cf-e78f5dc6b608" containerName="registry-server" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.294759 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.300140 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.300607 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.300854 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.300884 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.306932 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2"] Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.385833 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c240120-432c-4eca-a36a-b16b03d2fbd2-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-k57p2\" (UID: \"5c240120-432c-4eca-a36a-b16b03d2fbd2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.385894 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c240120-432c-4eca-a36a-b16b03d2fbd2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-k57p2\" (UID: \"5c240120-432c-4eca-a36a-b16b03d2fbd2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.386031 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv4pm\" (UniqueName: \"kubernetes.io/projected/5c240120-432c-4eca-a36a-b16b03d2fbd2-kube-api-access-hv4pm\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-k57p2\" (UID: \"5c240120-432c-4eca-a36a-b16b03d2fbd2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.487200 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c240120-432c-4eca-a36a-b16b03d2fbd2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-k57p2\" (UID: \"5c240120-432c-4eca-a36a-b16b03d2fbd2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.487257 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv4pm\" (UniqueName: \"kubernetes.io/projected/5c240120-432c-4eca-a36a-b16b03d2fbd2-kube-api-access-hv4pm\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-k57p2\" (UID: \"5c240120-432c-4eca-a36a-b16b03d2fbd2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" Dec 11 13:40:24 crc 
kubenswrapper[4898]: I1211 13:40:24.487427 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c240120-432c-4eca-a36a-b16b03d2fbd2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-k57p2\" (UID: \"5c240120-432c-4eca-a36a-b16b03d2fbd2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.491190 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c240120-432c-4eca-a36a-b16b03d2fbd2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-k57p2\" (UID: \"5c240120-432c-4eca-a36a-b16b03d2fbd2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.491324 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c240120-432c-4eca-a36a-b16b03d2fbd2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-k57p2\" (UID: \"5c240120-432c-4eca-a36a-b16b03d2fbd2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.503845 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv4pm\" (UniqueName: \"kubernetes.io/projected/5c240120-432c-4eca-a36a-b16b03d2fbd2-kube-api-access-hv4pm\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-k57p2\" (UID: \"5c240120-432c-4eca-a36a-b16b03d2fbd2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" Dec 11 13:40:24 crc kubenswrapper[4898]: I1211 13:40:24.647172 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" Dec 11 13:40:25 crc kubenswrapper[4898]: I1211 13:40:25.214703 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2"] Dec 11 13:40:26 crc kubenswrapper[4898]: I1211 13:40:26.194825 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" event={"ID":"5c240120-432c-4eca-a36a-b16b03d2fbd2","Type":"ContainerStarted","Data":"0ab2cd919f8dbcadc5061175e1ae17d028665642b38081365dd652408d626673"} Dec 11 13:40:26 crc kubenswrapper[4898]: I1211 13:40:26.195107 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" event={"ID":"5c240120-432c-4eca-a36a-b16b03d2fbd2","Type":"ContainerStarted","Data":"3353ffd079066e7d9bc98f1f41c9894cb944dc12bfc367b2d7fa2a920fdfc9c8"} Dec 11 13:40:26 crc kubenswrapper[4898]: I1211 13:40:26.206925 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" podStartSLOduration=1.591910178 podStartE2EDuration="2.206906442s" podCreationTimestamp="2025-12-11 13:40:24 +0000 UTC" firstStartedPulling="2025-12-11 13:40:25.216016581 +0000 UTC m=+2182.788343018" lastFinishedPulling="2025-12-11 13:40:25.831012845 +0000 UTC m=+2183.403339282" observedRunningTime="2025-12-11 13:40:26.206329576 +0000 UTC m=+2183.778656023" watchObservedRunningTime="2025-12-11 13:40:26.206906442 +0000 UTC m=+2183.779232889" Dec 11 13:40:34 crc kubenswrapper[4898]: I1211 13:40:34.995922 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:40:34 crc 
kubenswrapper[4898]: I1211 13:40:34.996432 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:40:34 crc kubenswrapper[4898]: I1211 13:40:34.996491 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:40:34 crc kubenswrapper[4898]: I1211 13:40:34.997421 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2dc6b826c592db36a826a1bc1d1e5ae9c15e49f961f4c157520efd5d4a76b991"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 13:40:34 crc kubenswrapper[4898]: I1211 13:40:34.997558 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://2dc6b826c592db36a826a1bc1d1e5ae9c15e49f961f4c157520efd5d4a76b991" gracePeriod=600 Dec 11 13:40:35 crc kubenswrapper[4898]: I1211 13:40:35.295602 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="2dc6b826c592db36a826a1bc1d1e5ae9c15e49f961f4c157520efd5d4a76b991" exitCode=0 Dec 11 13:40:35 crc kubenswrapper[4898]: I1211 13:40:35.295654 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"2dc6b826c592db36a826a1bc1d1e5ae9c15e49f961f4c157520efd5d4a76b991"} 
Dec 11 13:40:35 crc kubenswrapper[4898]: I1211 13:40:35.295694 4898 scope.go:117] "RemoveContainer" containerID="a48cb70f955c6dfbf71c08dc1817c1d1fa98db5c3f0203c3fcc75aa88e18e901" Dec 11 13:40:36 crc kubenswrapper[4898]: I1211 13:40:36.310493 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8"} Dec 11 13:41:22 crc kubenswrapper[4898]: I1211 13:41:22.052846 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-j2jpr"] Dec 11 13:41:22 crc kubenswrapper[4898]: I1211 13:41:22.068722 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-j2jpr"] Dec 11 13:41:22 crc kubenswrapper[4898]: I1211 13:41:22.810912 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77bd783a-a06c-400b-b29e-a6edb7d613b3" path="/var/lib/kubelet/pods/77bd783a-a06c-400b-b29e-a6edb7d613b3/volumes" Dec 11 13:41:24 crc kubenswrapper[4898]: I1211 13:41:24.003704 4898 generic.go:334] "Generic (PLEG): container finished" podID="5c240120-432c-4eca-a36a-b16b03d2fbd2" containerID="0ab2cd919f8dbcadc5061175e1ae17d028665642b38081365dd652408d626673" exitCode=0 Dec 11 13:41:24 crc kubenswrapper[4898]: I1211 13:41:24.003914 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" event={"ID":"5c240120-432c-4eca-a36a-b16b03d2fbd2","Type":"ContainerDied","Data":"0ab2cd919f8dbcadc5061175e1ae17d028665642b38081365dd652408d626673"} Dec 11 13:41:25 crc kubenswrapper[4898]: I1211 13:41:25.516016 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" Dec 11 13:41:25 crc kubenswrapper[4898]: I1211 13:41:25.564169 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c240120-432c-4eca-a36a-b16b03d2fbd2-inventory\") pod \"5c240120-432c-4eca-a36a-b16b03d2fbd2\" (UID: \"5c240120-432c-4eca-a36a-b16b03d2fbd2\") " Dec 11 13:41:25 crc kubenswrapper[4898]: I1211 13:41:25.564233 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv4pm\" (UniqueName: \"kubernetes.io/projected/5c240120-432c-4eca-a36a-b16b03d2fbd2-kube-api-access-hv4pm\") pod \"5c240120-432c-4eca-a36a-b16b03d2fbd2\" (UID: \"5c240120-432c-4eca-a36a-b16b03d2fbd2\") " Dec 11 13:41:25 crc kubenswrapper[4898]: I1211 13:41:25.564300 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c240120-432c-4eca-a36a-b16b03d2fbd2-ssh-key\") pod \"5c240120-432c-4eca-a36a-b16b03d2fbd2\" (UID: \"5c240120-432c-4eca-a36a-b16b03d2fbd2\") " Dec 11 13:41:25 crc kubenswrapper[4898]: I1211 13:41:25.570782 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c240120-432c-4eca-a36a-b16b03d2fbd2-kube-api-access-hv4pm" (OuterVolumeSpecName: "kube-api-access-hv4pm") pod "5c240120-432c-4eca-a36a-b16b03d2fbd2" (UID: "5c240120-432c-4eca-a36a-b16b03d2fbd2"). InnerVolumeSpecName "kube-api-access-hv4pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:41:25 crc kubenswrapper[4898]: I1211 13:41:25.596333 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c240120-432c-4eca-a36a-b16b03d2fbd2-inventory" (OuterVolumeSpecName: "inventory") pod "5c240120-432c-4eca-a36a-b16b03d2fbd2" (UID: "5c240120-432c-4eca-a36a-b16b03d2fbd2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:41:25 crc kubenswrapper[4898]: I1211 13:41:25.604803 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c240120-432c-4eca-a36a-b16b03d2fbd2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5c240120-432c-4eca-a36a-b16b03d2fbd2" (UID: "5c240120-432c-4eca-a36a-b16b03d2fbd2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:41:25 crc kubenswrapper[4898]: I1211 13:41:25.667416 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c240120-432c-4eca-a36a-b16b03d2fbd2-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:41:25 crc kubenswrapper[4898]: I1211 13:41:25.667451 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv4pm\" (UniqueName: \"kubernetes.io/projected/5c240120-432c-4eca-a36a-b16b03d2fbd2-kube-api-access-hv4pm\") on node \"crc\" DevicePath \"\"" Dec 11 13:41:25 crc kubenswrapper[4898]: I1211 13:41:25.667483 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c240120-432c-4eca-a36a-b16b03d2fbd2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.030840 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" event={"ID":"5c240120-432c-4eca-a36a-b16b03d2fbd2","Type":"ContainerDied","Data":"3353ffd079066e7d9bc98f1f41c9894cb944dc12bfc367b2d7fa2a920fdfc9c8"} Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.030892 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3353ffd079066e7d9bc98f1f41c9894cb944dc12bfc367b2d7fa2a920fdfc9c8" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.030893 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-k57p2" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.134752 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zzk8h"] Dec 11 13:41:26 crc kubenswrapper[4898]: E1211 13:41:26.135434 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c240120-432c-4eca-a36a-b16b03d2fbd2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.135474 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c240120-432c-4eca-a36a-b16b03d2fbd2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.135824 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c240120-432c-4eca-a36a-b16b03d2fbd2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.136839 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.140052 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.140311 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.141927 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.142004 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.166628 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zzk8h"] Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.284555 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/680cd71e-efd0-4750-80a1-3c719e9192c2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zzk8h\" (UID: \"680cd71e-efd0-4750-80a1-3c719e9192c2\") " pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.284779 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/680cd71e-efd0-4750-80a1-3c719e9192c2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zzk8h\" (UID: \"680cd71e-efd0-4750-80a1-3c719e9192c2\") " pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.285360 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6lwrq\" (UniqueName: \"kubernetes.io/projected/680cd71e-efd0-4750-80a1-3c719e9192c2-kube-api-access-6lwrq\") pod \"ssh-known-hosts-edpm-deployment-zzk8h\" (UID: \"680cd71e-efd0-4750-80a1-3c719e9192c2\") " pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.387505 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lwrq\" (UniqueName: \"kubernetes.io/projected/680cd71e-efd0-4750-80a1-3c719e9192c2-kube-api-access-6lwrq\") pod \"ssh-known-hosts-edpm-deployment-zzk8h\" (UID: \"680cd71e-efd0-4750-80a1-3c719e9192c2\") " pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.387606 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/680cd71e-efd0-4750-80a1-3c719e9192c2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zzk8h\" (UID: \"680cd71e-efd0-4750-80a1-3c719e9192c2\") " pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.387675 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/680cd71e-efd0-4750-80a1-3c719e9192c2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zzk8h\" (UID: \"680cd71e-efd0-4750-80a1-3c719e9192c2\") " pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.393594 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/680cd71e-efd0-4750-80a1-3c719e9192c2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zzk8h\" (UID: \"680cd71e-efd0-4750-80a1-3c719e9192c2\") " pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.401615 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/680cd71e-efd0-4750-80a1-3c719e9192c2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zzk8h\" (UID: \"680cd71e-efd0-4750-80a1-3c719e9192c2\") " pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.404041 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lwrq\" (UniqueName: \"kubernetes.io/projected/680cd71e-efd0-4750-80a1-3c719e9192c2-kube-api-access-6lwrq\") pod \"ssh-known-hosts-edpm-deployment-zzk8h\" (UID: \"680cd71e-efd0-4750-80a1-3c719e9192c2\") " pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" Dec 11 13:41:26 crc kubenswrapper[4898]: I1211 13:41:26.458514 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" Dec 11 13:41:27 crc kubenswrapper[4898]: I1211 13:41:27.033073 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 13:41:27 crc kubenswrapper[4898]: I1211 13:41:27.034080 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zzk8h"] Dec 11 13:41:27 crc kubenswrapper[4898]: I1211 13:41:27.044226 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" event={"ID":"680cd71e-efd0-4750-80a1-3c719e9192c2","Type":"ContainerStarted","Data":"98d1fba04a52ba9da7d670162a8e147e591cf1a7f4c0c1c4ab79534ecaebbde9"} Dec 11 13:41:28 crc kubenswrapper[4898]: I1211 13:41:28.053684 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" event={"ID":"680cd71e-efd0-4750-80a1-3c719e9192c2","Type":"ContainerStarted","Data":"85e9540f5e851a967071478bf42da69d462b09f82aedb5da71daa2316c92fc00"} Dec 11 13:41:28 crc kubenswrapper[4898]: I1211 
13:41:28.075332 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" podStartSLOduration=1.590794373 podStartE2EDuration="2.075314608s" podCreationTimestamp="2025-12-11 13:41:26 +0000 UTC" firstStartedPulling="2025-12-11 13:41:27.032889777 +0000 UTC m=+2244.605216214" lastFinishedPulling="2025-12-11 13:41:27.517410002 +0000 UTC m=+2245.089736449" observedRunningTime="2025-12-11 13:41:28.074812215 +0000 UTC m=+2245.647138672" watchObservedRunningTime="2025-12-11 13:41:28.075314608 +0000 UTC m=+2245.647641045" Dec 11 13:41:35 crc kubenswrapper[4898]: I1211 13:41:35.130818 4898 generic.go:334] "Generic (PLEG): container finished" podID="680cd71e-efd0-4750-80a1-3c719e9192c2" containerID="85e9540f5e851a967071478bf42da69d462b09f82aedb5da71daa2316c92fc00" exitCode=0 Dec 11 13:41:35 crc kubenswrapper[4898]: I1211 13:41:35.130907 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" event={"ID":"680cd71e-efd0-4750-80a1-3c719e9192c2","Type":"ContainerDied","Data":"85e9540f5e851a967071478bf42da69d462b09f82aedb5da71daa2316c92fc00"} Dec 11 13:41:36 crc kubenswrapper[4898]: I1211 13:41:36.764123 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" Dec 11 13:41:36 crc kubenswrapper[4898]: I1211 13:41:36.857783 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lwrq\" (UniqueName: \"kubernetes.io/projected/680cd71e-efd0-4750-80a1-3c719e9192c2-kube-api-access-6lwrq\") pod \"680cd71e-efd0-4750-80a1-3c719e9192c2\" (UID: \"680cd71e-efd0-4750-80a1-3c719e9192c2\") " Dec 11 13:41:36 crc kubenswrapper[4898]: I1211 13:41:36.857854 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/680cd71e-efd0-4750-80a1-3c719e9192c2-ssh-key-openstack-edpm-ipam\") pod \"680cd71e-efd0-4750-80a1-3c719e9192c2\" (UID: \"680cd71e-efd0-4750-80a1-3c719e9192c2\") " Dec 11 13:41:36 crc kubenswrapper[4898]: I1211 13:41:36.857895 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/680cd71e-efd0-4750-80a1-3c719e9192c2-inventory-0\") pod \"680cd71e-efd0-4750-80a1-3c719e9192c2\" (UID: \"680cd71e-efd0-4750-80a1-3c719e9192c2\") " Dec 11 13:41:36 crc kubenswrapper[4898]: I1211 13:41:36.881041 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/680cd71e-efd0-4750-80a1-3c719e9192c2-kube-api-access-6lwrq" (OuterVolumeSpecName: "kube-api-access-6lwrq") pod "680cd71e-efd0-4750-80a1-3c719e9192c2" (UID: "680cd71e-efd0-4750-80a1-3c719e9192c2"). InnerVolumeSpecName "kube-api-access-6lwrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:41:36 crc kubenswrapper[4898]: I1211 13:41:36.892990 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/680cd71e-efd0-4750-80a1-3c719e9192c2-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "680cd71e-efd0-4750-80a1-3c719e9192c2" (UID: "680cd71e-efd0-4750-80a1-3c719e9192c2"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:41:36 crc kubenswrapper[4898]: I1211 13:41:36.901958 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/680cd71e-efd0-4750-80a1-3c719e9192c2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "680cd71e-efd0-4750-80a1-3c719e9192c2" (UID: "680cd71e-efd0-4750-80a1-3c719e9192c2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:41:36 crc kubenswrapper[4898]: I1211 13:41:36.960848 4898 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/680cd71e-efd0-4750-80a1-3c719e9192c2-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:41:36 crc kubenswrapper[4898]: I1211 13:41:36.960882 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lwrq\" (UniqueName: \"kubernetes.io/projected/680cd71e-efd0-4750-80a1-3c719e9192c2-kube-api-access-6lwrq\") on node \"crc\" DevicePath \"\"" Dec 11 13:41:36 crc kubenswrapper[4898]: I1211 13:41:36.960896 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/680cd71e-efd0-4750-80a1-3c719e9192c2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.153659 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" event={"ID":"680cd71e-efd0-4750-80a1-3c719e9192c2","Type":"ContainerDied","Data":"98d1fba04a52ba9da7d670162a8e147e591cf1a7f4c0c1c4ab79534ecaebbde9"} Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.153716 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98d1fba04a52ba9da7d670162a8e147e591cf1a7f4c0c1c4ab79534ecaebbde9" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.153740 
4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zzk8h" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.242335 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h"] Dec 11 13:41:37 crc kubenswrapper[4898]: E1211 13:41:37.243192 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="680cd71e-efd0-4750-80a1-3c719e9192c2" containerName="ssh-known-hosts-edpm-deployment" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.243231 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="680cd71e-efd0-4750-80a1-3c719e9192c2" containerName="ssh-known-hosts-edpm-deployment" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.243734 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="680cd71e-efd0-4750-80a1-3c719e9192c2" containerName="ssh-known-hosts-edpm-deployment" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.244967 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.247585 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.250146 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.250277 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.253648 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.265906 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h"] Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.269697 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ffe508f-3789-4430-89e6-fa3faa46514d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nz76h\" (UID: \"0ffe508f-3789-4430-89e6-fa3faa46514d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.269809 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66gxj\" (UniqueName: \"kubernetes.io/projected/0ffe508f-3789-4430-89e6-fa3faa46514d-kube-api-access-66gxj\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nz76h\" (UID: \"0ffe508f-3789-4430-89e6-fa3faa46514d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.269987 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ffe508f-3789-4430-89e6-fa3faa46514d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nz76h\" (UID: \"0ffe508f-3789-4430-89e6-fa3faa46514d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.372610 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ffe508f-3789-4430-89e6-fa3faa46514d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nz76h\" (UID: \"0ffe508f-3789-4430-89e6-fa3faa46514d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.372938 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66gxj\" (UniqueName: \"kubernetes.io/projected/0ffe508f-3789-4430-89e6-fa3faa46514d-kube-api-access-66gxj\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nz76h\" (UID: \"0ffe508f-3789-4430-89e6-fa3faa46514d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.373086 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ffe508f-3789-4430-89e6-fa3faa46514d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nz76h\" (UID: \"0ffe508f-3789-4430-89e6-fa3faa46514d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.378355 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ffe508f-3789-4430-89e6-fa3faa46514d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nz76h\" (UID: \"0ffe508f-3789-4430-89e6-fa3faa46514d\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.379025 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ffe508f-3789-4430-89e6-fa3faa46514d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nz76h\" (UID: \"0ffe508f-3789-4430-89e6-fa3faa46514d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.394523 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66gxj\" (UniqueName: \"kubernetes.io/projected/0ffe508f-3789-4430-89e6-fa3faa46514d-kube-api-access-66gxj\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nz76h\" (UID: \"0ffe508f-3789-4430-89e6-fa3faa46514d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" Dec 11 13:41:37 crc kubenswrapper[4898]: I1211 13:41:37.568425 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" Dec 11 13:41:38 crc kubenswrapper[4898]: I1211 13:41:38.147279 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h"] Dec 11 13:41:38 crc kubenswrapper[4898]: I1211 13:41:38.168443 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" event={"ID":"0ffe508f-3789-4430-89e6-fa3faa46514d","Type":"ContainerStarted","Data":"bb1d3ca6dfbac1ca6f7e8e74376b1dc536eb034709acd73092d6a776ea10347d"} Dec 11 13:41:39 crc kubenswrapper[4898]: I1211 13:41:39.128464 4898 scope.go:117] "RemoveContainer" containerID="46916f77e2e5c90443e46eb8bb07913bae74b2a349750f84264c99971a3a5435" Dec 11 13:41:39 crc kubenswrapper[4898]: I1211 13:41:39.189566 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" event={"ID":"0ffe508f-3789-4430-89e6-fa3faa46514d","Type":"ContainerStarted","Data":"0ee99a78431c4543eb606f6a17b3d6fe63f07c9efc1ade0f6b81487b0886fe75"} Dec 11 13:41:39 crc kubenswrapper[4898]: I1211 13:41:39.220846 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" podStartSLOduration=1.5250002280000001 podStartE2EDuration="2.220820022s" podCreationTimestamp="2025-12-11 13:41:37 +0000 UTC" firstStartedPulling="2025-12-11 13:41:38.14493119 +0000 UTC m=+2255.717257647" lastFinishedPulling="2025-12-11 13:41:38.840750994 +0000 UTC m=+2256.413077441" observedRunningTime="2025-12-11 13:41:39.214632457 +0000 UTC m=+2256.786958894" watchObservedRunningTime="2025-12-11 13:41:39.220820022 +0000 UTC m=+2256.793146479" Dec 11 13:41:48 crc kubenswrapper[4898]: I1211 13:41:48.335906 4898 generic.go:334] "Generic (PLEG): container finished" podID="0ffe508f-3789-4430-89e6-fa3faa46514d" 
containerID="0ee99a78431c4543eb606f6a17b3d6fe63f07c9efc1ade0f6b81487b0886fe75" exitCode=0 Dec 11 13:41:48 crc kubenswrapper[4898]: I1211 13:41:48.336008 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" event={"ID":"0ffe508f-3789-4430-89e6-fa3faa46514d","Type":"ContainerDied","Data":"0ee99a78431c4543eb606f6a17b3d6fe63f07c9efc1ade0f6b81487b0886fe75"} Dec 11 13:41:49 crc kubenswrapper[4898]: I1211 13:41:49.954620 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.013650 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ffe508f-3789-4430-89e6-fa3faa46514d-inventory\") pod \"0ffe508f-3789-4430-89e6-fa3faa46514d\" (UID: \"0ffe508f-3789-4430-89e6-fa3faa46514d\") " Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.013752 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ffe508f-3789-4430-89e6-fa3faa46514d-ssh-key\") pod \"0ffe508f-3789-4430-89e6-fa3faa46514d\" (UID: \"0ffe508f-3789-4430-89e6-fa3faa46514d\") " Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.013795 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66gxj\" (UniqueName: \"kubernetes.io/projected/0ffe508f-3789-4430-89e6-fa3faa46514d-kube-api-access-66gxj\") pod \"0ffe508f-3789-4430-89e6-fa3faa46514d\" (UID: \"0ffe508f-3789-4430-89e6-fa3faa46514d\") " Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.019172 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ffe508f-3789-4430-89e6-fa3faa46514d-kube-api-access-66gxj" (OuterVolumeSpecName: "kube-api-access-66gxj") pod "0ffe508f-3789-4430-89e6-fa3faa46514d" (UID: 
"0ffe508f-3789-4430-89e6-fa3faa46514d"). InnerVolumeSpecName "kube-api-access-66gxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.042802 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ffe508f-3789-4430-89e6-fa3faa46514d-inventory" (OuterVolumeSpecName: "inventory") pod "0ffe508f-3789-4430-89e6-fa3faa46514d" (UID: "0ffe508f-3789-4430-89e6-fa3faa46514d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.055302 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ffe508f-3789-4430-89e6-fa3faa46514d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0ffe508f-3789-4430-89e6-fa3faa46514d" (UID: "0ffe508f-3789-4430-89e6-fa3faa46514d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.116298 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ffe508f-3789-4430-89e6-fa3faa46514d-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.116328 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0ffe508f-3789-4430-89e6-fa3faa46514d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.116340 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66gxj\" (UniqueName: \"kubernetes.io/projected/0ffe508f-3789-4430-89e6-fa3faa46514d-kube-api-access-66gxj\") on node \"crc\" DevicePath \"\"" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.361800 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" 
event={"ID":"0ffe508f-3789-4430-89e6-fa3faa46514d","Type":"ContainerDied","Data":"bb1d3ca6dfbac1ca6f7e8e74376b1dc536eb034709acd73092d6a776ea10347d"} Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.361863 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb1d3ca6dfbac1ca6f7e8e74376b1dc536eb034709acd73092d6a776ea10347d" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.361887 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nz76h" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.445553 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92"] Dec 11 13:41:50 crc kubenswrapper[4898]: E1211 13:41:50.446026 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ffe508f-3789-4430-89e6-fa3faa46514d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.446043 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ffe508f-3789-4430-89e6-fa3faa46514d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.446280 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ffe508f-3789-4430-89e6-fa3faa46514d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.447019 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.448951 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.450034 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.450102 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.450437 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.479944 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92"] Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.527557 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8q29\" (UniqueName: \"kubernetes.io/projected/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-kube-api-access-r8q29\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92\" (UID: \"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.527647 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92\" (UID: \"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.527860 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92\" (UID: \"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.629962 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8q29\" (UniqueName: \"kubernetes.io/projected/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-kube-api-access-r8q29\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92\" (UID: \"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.630033 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92\" (UID: \"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.630132 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92\" (UID: \"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.642121 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92\" (UID: \"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.642406 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92\" (UID: \"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.654312 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8q29\" (UniqueName: \"kubernetes.io/projected/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-kube-api-access-r8q29\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92\" (UID: \"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" Dec 11 13:41:50 crc kubenswrapper[4898]: I1211 13:41:50.779630 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" Dec 11 13:41:51 crc kubenswrapper[4898]: I1211 13:41:51.414303 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92"] Dec 11 13:41:52 crc kubenswrapper[4898]: I1211 13:41:52.389831 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" event={"ID":"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6","Type":"ContainerStarted","Data":"7a2fe73b8e41661e9d5c4aa3c406a376fb3227d931b38ce872f21a9f357d0b66"} Dec 11 13:41:52 crc kubenswrapper[4898]: I1211 13:41:52.390582 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" event={"ID":"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6","Type":"ContainerStarted","Data":"8e7ecaadfd7a12831fbee9c2093e3c5c42bb374f4fccc33b3990e3a3c02a562f"} Dec 11 13:41:52 crc kubenswrapper[4898]: I1211 13:41:52.432103 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" podStartSLOduration=1.951494177 podStartE2EDuration="2.432081078s" podCreationTimestamp="2025-12-11 13:41:50 +0000 UTC" firstStartedPulling="2025-12-11 13:41:51.421411681 +0000 UTC m=+2268.993738118" lastFinishedPulling="2025-12-11 13:41:51.901998582 +0000 UTC m=+2269.474325019" observedRunningTime="2025-12-11 13:41:52.413595507 +0000 UTC m=+2269.985921944" watchObservedRunningTime="2025-12-11 13:41:52.432081078 +0000 UTC m=+2270.004407535" Dec 11 13:42:02 crc kubenswrapper[4898]: I1211 13:42:02.517912 4898 generic.go:334] "Generic (PLEG): container finished" podID="968ddbd1-b46e-4d09-85d8-ebcf30b32cb6" containerID="7a2fe73b8e41661e9d5c4aa3c406a376fb3227d931b38ce872f21a9f357d0b66" exitCode=0 Dec 11 13:42:02 crc kubenswrapper[4898]: I1211 13:42:02.517976 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" event={"ID":"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6","Type":"ContainerDied","Data":"7a2fe73b8e41661e9d5c4aa3c406a376fb3227d931b38ce872f21a9f357d0b66"} Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.035043 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.114738 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-inventory\") pod \"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6\" (UID: \"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6\") " Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.115426 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8q29\" (UniqueName: \"kubernetes.io/projected/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-kube-api-access-r8q29\") pod \"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6\" (UID: \"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6\") " Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.115681 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-ssh-key\") pod \"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6\" (UID: \"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6\") " Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.126741 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-kube-api-access-r8q29" (OuterVolumeSpecName: "kube-api-access-r8q29") pod "968ddbd1-b46e-4d09-85d8-ebcf30b32cb6" (UID: "968ddbd1-b46e-4d09-85d8-ebcf30b32cb6"). InnerVolumeSpecName "kube-api-access-r8q29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.170727 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "968ddbd1-b46e-4d09-85d8-ebcf30b32cb6" (UID: "968ddbd1-b46e-4d09-85d8-ebcf30b32cb6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.171852 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-inventory" (OuterVolumeSpecName: "inventory") pod "968ddbd1-b46e-4d09-85d8-ebcf30b32cb6" (UID: "968ddbd1-b46e-4d09-85d8-ebcf30b32cb6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.238154 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8q29\" (UniqueName: \"kubernetes.io/projected/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-kube-api-access-r8q29\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.238185 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.238195 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/968ddbd1-b46e-4d09-85d8-ebcf30b32cb6-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.543813 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" 
event={"ID":"968ddbd1-b46e-4d09-85d8-ebcf30b32cb6","Type":"ContainerDied","Data":"8e7ecaadfd7a12831fbee9c2093e3c5c42bb374f4fccc33b3990e3a3c02a562f"} Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.543866 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e7ecaadfd7a12831fbee9c2093e3c5c42bb374f4fccc33b3990e3a3c02a562f" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.543923 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.635496 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l"] Dec 11 13:42:04 crc kubenswrapper[4898]: E1211 13:42:04.636228 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="968ddbd1-b46e-4d09-85d8-ebcf30b32cb6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.636274 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="968ddbd1-b46e-4d09-85d8-ebcf30b32cb6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.636680 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="968ddbd1-b46e-4d09-85d8-ebcf30b32cb6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.637619 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.639479 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.639753 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.640342 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.640679 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.643903 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.649961 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.650176 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.650290 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.651383 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.654569 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l"] Dec 11 13:42:04 crc 
kubenswrapper[4898]: I1211 13:42:04.751775 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.751841 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.751875 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.751937 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbttr\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-kube-api-access-rbttr\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.752000 
4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.752042 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.752100 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.752129 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.752180 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.752247 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.752301 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.752373 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.752447 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.752556 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.752669 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.752746 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 
13:42:04.862638 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.862733 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.862789 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.862836 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.862917 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.862968 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.863005 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.863039 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.863088 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.863186 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.863326 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.863401 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.863494 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.863529 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.863566 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.863628 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbttr\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-kube-api-access-rbttr\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.866756 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: 
\"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.868054 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.869362 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.870773 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.871518 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.871675 
4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.873114 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.873789 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.874804 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.876324 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.878765 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.879328 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.879903 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.879996 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.881210 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:04 crc kubenswrapper[4898]: I1211 13:42:04.883402 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbttr\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-kube-api-access-rbttr\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:05 crc kubenswrapper[4898]: I1211 13:42:05.004722 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:06 crc kubenswrapper[4898]: I1211 13:42:05.627986 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l"] Dec 11 13:42:06 crc kubenswrapper[4898]: I1211 13:42:06.567983 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" event={"ID":"799a6ddd-3abc-4961-92b4-4a9db3b41f64","Type":"ContainerStarted","Data":"daa4aec72ebb4c9172bc708f1f2a46def54521c786393326c4487662a8a51999"} Dec 11 13:42:07 crc kubenswrapper[4898]: I1211 13:42:07.043283 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-8hx6m"] Dec 11 13:42:07 crc kubenswrapper[4898]: I1211 13:42:07.054727 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-8hx6m"] Dec 11 13:42:07 crc kubenswrapper[4898]: I1211 13:42:07.583345 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" event={"ID":"799a6ddd-3abc-4961-92b4-4a9db3b41f64","Type":"ContainerStarted","Data":"919db97973eecf8e1be4bc4d3618760af6bcc7e3791ec0d48e12851bbacd38ca"} Dec 11 13:42:07 crc kubenswrapper[4898]: I1211 13:42:07.621429 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" podStartSLOduration=2.91791072 podStartE2EDuration="3.621402108s" podCreationTimestamp="2025-12-11 13:42:04 +0000 UTC" firstStartedPulling="2025-12-11 13:42:05.629482746 +0000 UTC m=+2283.201809183" lastFinishedPulling="2025-12-11 13:42:06.332974124 +0000 UTC m=+2283.905300571" observedRunningTime="2025-12-11 13:42:07.612559853 +0000 UTC m=+2285.184886360" watchObservedRunningTime="2025-12-11 13:42:07.621402108 +0000 UTC m=+2285.193728575" Dec 11 13:42:08 crc kubenswrapper[4898]: I1211 13:42:08.804085 4898 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c27755f-ec0b-4b73-a18b-f09ed1715ef7" path="/var/lib/kubelet/pods/0c27755f-ec0b-4b73-a18b-f09ed1715ef7/volumes" Dec 11 13:42:39 crc kubenswrapper[4898]: I1211 13:42:39.211223 4898 scope.go:117] "RemoveContainer" containerID="f2617bd5248cbaeb588c5976d0c10977ae6e5fa5efacba0fe3d07fd368034cba" Dec 11 13:42:57 crc kubenswrapper[4898]: I1211 13:42:57.228376 4898 generic.go:334] "Generic (PLEG): container finished" podID="799a6ddd-3abc-4961-92b4-4a9db3b41f64" containerID="919db97973eecf8e1be4bc4d3618760af6bcc7e3791ec0d48e12851bbacd38ca" exitCode=0 Dec 11 13:42:57 crc kubenswrapper[4898]: I1211 13:42:57.229889 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" event={"ID":"799a6ddd-3abc-4961-92b4-4a9db3b41f64","Type":"ContainerDied","Data":"919db97973eecf8e1be4bc4d3618760af6bcc7e3791ec0d48e12851bbacd38ca"} Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.758088 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.838800 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.838868 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-telemetry-combined-ca-bundle\") pod \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.838890 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.838913 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-repo-setup-combined-ca-bundle\") pod \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.838983 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-libvirt-combined-ca-bundle\") pod \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.839030 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-bootstrap-combined-ca-bundle\") pod \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.839057 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-ovn-default-certs-0\") pod \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.839154 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-nova-combined-ca-bundle\") pod \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.839178 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-ovn-combined-ca-bundle\") pod \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.839287 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbttr\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-kube-api-access-rbttr\") pod 
\"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.839311 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-telemetry-power-monitoring-combined-ca-bundle\") pod \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.839346 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-neutron-metadata-combined-ca-bundle\") pod \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.839365 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.839394 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-ssh-key\") pod \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.839475 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.839505 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-inventory\") pod \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\" (UID: \"799a6ddd-3abc-4961-92b4-4a9db3b41f64\") " Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.846745 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.847214 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.848018 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.848028 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.848271 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.848380 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-kube-api-access-rbttr" (OuterVolumeSpecName: "kube-api-access-rbttr") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). InnerVolumeSpecName "kube-api-access-rbttr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.849981 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). 
InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.850021 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.850491 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.853165 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.853248 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.853335 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.853482 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.854663 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.880958 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.883721 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-inventory" (OuterVolumeSpecName: "inventory") pod "799a6ddd-3abc-4961-92b4-4a9db3b41f64" (UID: "799a6ddd-3abc-4961-92b4-4a9db3b41f64"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.942676 4898 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.943030 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.943184 4898 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.943317 4898 
reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.943446 4898 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.943614 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.943761 4898 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.943918 4898 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.944060 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbttr\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-kube-api-access-rbttr\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.944156 4898 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.944266 4898 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.944396 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.944565 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.944724 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.944886 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/799a6ddd-3abc-4961-92b4-4a9db3b41f64-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:58 crc kubenswrapper[4898]: I1211 13:42:58.945033 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/799a6ddd-3abc-4961-92b4-4a9db3b41f64-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.285044 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.285685 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l" event={"ID":"799a6ddd-3abc-4961-92b4-4a9db3b41f64","Type":"ContainerDied","Data":"daa4aec72ebb4c9172bc708f1f2a46def54521c786393326c4487662a8a51999"} Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.285722 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daa4aec72ebb4c9172bc708f1f2a46def54521c786393326c4487662a8a51999" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.440227 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq"] Dec 11 13:42:59 crc kubenswrapper[4898]: E1211 13:42:59.440883 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="799a6ddd-3abc-4961-92b4-4a9db3b41f64" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.440909 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="799a6ddd-3abc-4961-92b4-4a9db3b41f64" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.441236 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="799a6ddd-3abc-4961-92b4-4a9db3b41f64" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.442276 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.444491 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.444825 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.445286 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.448234 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.448549 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.471847 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq"] Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.559091 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4pbs\" (UniqueName: \"kubernetes.io/projected/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-kube-api-access-f4pbs\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nhkjq\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.559146 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nhkjq\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.559178 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nhkjq\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.559309 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nhkjq\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.559591 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nhkjq\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.661967 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4pbs\" (UniqueName: \"kubernetes.io/projected/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-kube-api-access-f4pbs\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nhkjq\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.662033 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nhkjq\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.662063 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nhkjq\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.662149 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nhkjq\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.662297 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nhkjq\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.663113 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nhkjq\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc 
kubenswrapper[4898]: I1211 13:42:59.666181 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nhkjq\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.667192 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nhkjq\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.667873 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nhkjq\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.679050 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4pbs\" (UniqueName: \"kubernetes.io/projected/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-kube-api-access-f4pbs\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nhkjq\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:42:59 crc kubenswrapper[4898]: I1211 13:42:59.759822 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:43:00 crc kubenswrapper[4898]: I1211 13:43:00.339685 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq"] Dec 11 13:43:01 crc kubenswrapper[4898]: I1211 13:43:01.306521 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" event={"ID":"68ef432f-8154-4f0d-b92f-5cfeed0c22ce","Type":"ContainerStarted","Data":"c2e7ccf691c77995f334243abcd358b672e19ea5e9f567390a3b05391d8883de"} Dec 11 13:43:02 crc kubenswrapper[4898]: I1211 13:43:02.321159 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" event={"ID":"68ef432f-8154-4f0d-b92f-5cfeed0c22ce","Type":"ContainerStarted","Data":"b1a140bf6adacdd0abd64ac95015f03d7d7e356d987f42da072ae7ed5753824c"} Dec 11 13:43:02 crc kubenswrapper[4898]: I1211 13:43:02.350799 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" podStartSLOduration=2.628843906 podStartE2EDuration="3.350751864s" podCreationTimestamp="2025-12-11 13:42:59 +0000 UTC" firstStartedPulling="2025-12-11 13:43:00.35419485 +0000 UTC m=+2337.926521307" lastFinishedPulling="2025-12-11 13:43:01.076102828 +0000 UTC m=+2338.648429265" observedRunningTime="2025-12-11 13:43:02.33741782 +0000 UTC m=+2339.909744307" watchObservedRunningTime="2025-12-11 13:43:02.350751864 +0000 UTC m=+2339.923078341" Dec 11 13:43:04 crc kubenswrapper[4898]: I1211 13:43:04.995927 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:43:04 crc kubenswrapper[4898]: I1211 13:43:04.996253 4898 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:43:34 crc kubenswrapper[4898]: I1211 13:43:34.996032 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:43:34 crc kubenswrapper[4898]: I1211 13:43:34.996625 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:44:04 crc kubenswrapper[4898]: I1211 13:44:04.996367 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:44:04 crc kubenswrapper[4898]: I1211 13:44:04.997122 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:44:04 crc kubenswrapper[4898]: I1211 13:44:04.997193 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:44:04 crc kubenswrapper[4898]: I1211 13:44:04.998600 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 13:44:04 crc kubenswrapper[4898]: I1211 13:44:04.998705 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" gracePeriod=600 Dec 11 13:44:05 crc kubenswrapper[4898]: E1211 13:44:05.128413 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:44:06 crc kubenswrapper[4898]: I1211 13:44:06.054564 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" exitCode=0 Dec 11 13:44:06 crc kubenswrapper[4898]: I1211 13:44:06.054689 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8"} Dec 11 13:44:06 crc 
kubenswrapper[4898]: I1211 13:44:06.054792 4898 scope.go:117] "RemoveContainer" containerID="2dc6b826c592db36a826a1bc1d1e5ae9c15e49f961f4c157520efd5d4a76b991" Dec 11 13:44:06 crc kubenswrapper[4898]: I1211 13:44:06.055608 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:44:06 crc kubenswrapper[4898]: E1211 13:44:06.056010 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:44:08 crc kubenswrapper[4898]: I1211 13:44:08.079624 4898 generic.go:334] "Generic (PLEG): container finished" podID="68ef432f-8154-4f0d-b92f-5cfeed0c22ce" containerID="b1a140bf6adacdd0abd64ac95015f03d7d7e356d987f42da072ae7ed5753824c" exitCode=0 Dec 11 13:44:08 crc kubenswrapper[4898]: I1211 13:44:08.079699 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" event={"ID":"68ef432f-8154-4f0d-b92f-5cfeed0c22ce","Type":"ContainerDied","Data":"b1a140bf6adacdd0abd64ac95015f03d7d7e356d987f42da072ae7ed5753824c"} Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.587922 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.669110 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ssh-key\") pod \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.669513 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ovncontroller-config-0\") pod \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.669568 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-inventory\") pod \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.669634 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4pbs\" (UniqueName: \"kubernetes.io/projected/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-kube-api-access-f4pbs\") pod \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.669663 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ovn-combined-ca-bundle\") pod \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\" (UID: \"68ef432f-8154-4f0d-b92f-5cfeed0c22ce\") " Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.676280 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-kube-api-access-f4pbs" (OuterVolumeSpecName: "kube-api-access-f4pbs") pod "68ef432f-8154-4f0d-b92f-5cfeed0c22ce" (UID: "68ef432f-8154-4f0d-b92f-5cfeed0c22ce"). InnerVolumeSpecName "kube-api-access-f4pbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.679195 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "68ef432f-8154-4f0d-b92f-5cfeed0c22ce" (UID: "68ef432f-8154-4f0d-b92f-5cfeed0c22ce"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.702804 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-inventory" (OuterVolumeSpecName: "inventory") pod "68ef432f-8154-4f0d-b92f-5cfeed0c22ce" (UID: "68ef432f-8154-4f0d-b92f-5cfeed0c22ce"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.721963 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "68ef432f-8154-4f0d-b92f-5cfeed0c22ce" (UID: "68ef432f-8154-4f0d-b92f-5cfeed0c22ce"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.729026 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "68ef432f-8154-4f0d-b92f-5cfeed0c22ce" (UID: "68ef432f-8154-4f0d-b92f-5cfeed0c22ce"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.772724 4898 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.772903 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.772961 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4pbs\" (UniqueName: \"kubernetes.io/projected/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-kube-api-access-f4pbs\") on node \"crc\" DevicePath \"\"" Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.773019 4898 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:44:09 crc kubenswrapper[4898]: I1211 13:44:09.773067 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68ef432f-8154-4f0d-b92f-5cfeed0c22ce-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.103546 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" event={"ID":"68ef432f-8154-4f0d-b92f-5cfeed0c22ce","Type":"ContainerDied","Data":"c2e7ccf691c77995f334243abcd358b672e19ea5e9f567390a3b05391d8883de"} Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.103590 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2e7ccf691c77995f334243abcd358b672e19ea5e9f567390a3b05391d8883de" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.103598 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nhkjq" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.198481 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42"] Dec 11 13:44:10 crc kubenswrapper[4898]: E1211 13:44:10.199382 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68ef432f-8154-4f0d-b92f-5cfeed0c22ce" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.199504 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="68ef432f-8154-4f0d-b92f-5cfeed0c22ce" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.199879 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="68ef432f-8154-4f0d-b92f-5cfeed0c22ce" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.201015 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.203656 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.203924 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.204753 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.212502 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42"] Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.248586 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.250449 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.251924 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.286649 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.287237 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx4xk\" (UniqueName: \"kubernetes.io/projected/67ae3c0c-6ce1-429e-953d-2cac885ee43c-kube-api-access-sx4xk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.287403 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.287699 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.287907 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.288063 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.390787 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.390863 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.390896 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.390940 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.391044 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx4xk\" (UniqueName: \"kubernetes.io/projected/67ae3c0c-6ce1-429e-953d-2cac885ee43c-kube-api-access-sx4xk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.391696 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.402212 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.402216 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: 
\"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.404147 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.406024 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.411962 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.433152 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx4xk\" (UniqueName: \"kubernetes.io/projected/67ae3c0c-6ce1-429e-953d-2cac885ee43c-kube-api-access-sx4xk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:10 crc kubenswrapper[4898]: I1211 13:44:10.553624 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:44:11 crc kubenswrapper[4898]: I1211 13:44:11.080300 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42"] Dec 11 13:44:11 crc kubenswrapper[4898]: I1211 13:44:11.116552 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" event={"ID":"67ae3c0c-6ce1-429e-953d-2cac885ee43c","Type":"ContainerStarted","Data":"8e8e7b2aa6615611e03d53701093ac03c4fcd616d784185a3e439683b06b7191"} Dec 11 13:44:12 crc kubenswrapper[4898]: I1211 13:44:12.141260 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" event={"ID":"67ae3c0c-6ce1-429e-953d-2cac885ee43c","Type":"ContainerStarted","Data":"cc6eace62db90f89b417d4b3f590f650e920a06dc4d4cadd270672c5af4f70e5"} Dec 11 13:44:12 crc kubenswrapper[4898]: I1211 13:44:12.160444 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" podStartSLOduration=1.586833447 podStartE2EDuration="2.160416858s" podCreationTimestamp="2025-12-11 13:44:10 +0000 UTC" firstStartedPulling="2025-12-11 13:44:11.092898176 +0000 UTC m=+2408.665224613" lastFinishedPulling="2025-12-11 13:44:11.666481577 +0000 UTC m=+2409.238808024" observedRunningTime="2025-12-11 13:44:12.159236066 +0000 UTC m=+2409.731562593" watchObservedRunningTime="2025-12-11 13:44:12.160416858 +0000 UTC m=+2409.732743325" Dec 11 13:44:18 crc kubenswrapper[4898]: I1211 13:44:18.776070 4898 scope.go:117] "RemoveContainer" 
containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:44:18 crc kubenswrapper[4898]: E1211 13:44:18.777070 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:44:31 crc kubenswrapper[4898]: I1211 13:44:31.775959 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:44:31 crc kubenswrapper[4898]: E1211 13:44:31.778294 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:44:43 crc kubenswrapper[4898]: I1211 13:44:43.776428 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:44:43 crc kubenswrapper[4898]: E1211 13:44:43.777214 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:44:56 crc kubenswrapper[4898]: I1211 13:44:56.775322 4898 scope.go:117] 
"RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:44:56 crc kubenswrapper[4898]: E1211 13:44:56.776038 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.154657 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98"] Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.157363 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.159968 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.161772 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.165122 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98"] Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.253255 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d34ea67-eff5-4813-b997-35b5656058dd-config-volume\") pod \"collect-profiles-29424345-ksd98\" (UID: \"6d34ea67-eff5-4813-b997-35b5656058dd\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.253314 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc5kx\" (UniqueName: \"kubernetes.io/projected/6d34ea67-eff5-4813-b997-35b5656058dd-kube-api-access-lc5kx\") pod \"collect-profiles-29424345-ksd98\" (UID: \"6d34ea67-eff5-4813-b997-35b5656058dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.253924 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d34ea67-eff5-4813-b997-35b5656058dd-secret-volume\") pod \"collect-profiles-29424345-ksd98\" (UID: \"6d34ea67-eff5-4813-b997-35b5656058dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.356949 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d34ea67-eff5-4813-b997-35b5656058dd-config-volume\") pod \"collect-profiles-29424345-ksd98\" (UID: \"6d34ea67-eff5-4813-b997-35b5656058dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.357070 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc5kx\" (UniqueName: \"kubernetes.io/projected/6d34ea67-eff5-4813-b997-35b5656058dd-kube-api-access-lc5kx\") pod \"collect-profiles-29424345-ksd98\" (UID: \"6d34ea67-eff5-4813-b997-35b5656058dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.357420 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6d34ea67-eff5-4813-b997-35b5656058dd-secret-volume\") pod \"collect-profiles-29424345-ksd98\" (UID: \"6d34ea67-eff5-4813-b997-35b5656058dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.358279 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d34ea67-eff5-4813-b997-35b5656058dd-config-volume\") pod \"collect-profiles-29424345-ksd98\" (UID: \"6d34ea67-eff5-4813-b997-35b5656058dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.370422 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d34ea67-eff5-4813-b997-35b5656058dd-secret-volume\") pod \"collect-profiles-29424345-ksd98\" (UID: \"6d34ea67-eff5-4813-b997-35b5656058dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.385988 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc5kx\" (UniqueName: \"kubernetes.io/projected/6d34ea67-eff5-4813-b997-35b5656058dd-kube-api-access-lc5kx\") pod \"collect-profiles-29424345-ksd98\" (UID: \"6d34ea67-eff5-4813-b997-35b5656058dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.527020 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" Dec 11 13:45:00 crc kubenswrapper[4898]: I1211 13:45:00.981207 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98"] Dec 11 13:45:00 crc kubenswrapper[4898]: W1211 13:45:00.984753 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d34ea67_eff5_4813_b997_35b5656058dd.slice/crio-73be8fb521d4d9e96210dcea944d3d8e658544df308858ed4f9790bfae54c655 WatchSource:0}: Error finding container 73be8fb521d4d9e96210dcea944d3d8e658544df308858ed4f9790bfae54c655: Status 404 returned error can't find the container with id 73be8fb521d4d9e96210dcea944d3d8e658544df308858ed4f9790bfae54c655 Dec 11 13:45:01 crc kubenswrapper[4898]: I1211 13:45:01.791809 4898 generic.go:334] "Generic (PLEG): container finished" podID="6d34ea67-eff5-4813-b997-35b5656058dd" containerID="fd0c91d3f24fedb4feab2794bb77331a3937188e1596bc5b4c6ebda68444b0cd" exitCode=0 Dec 11 13:45:01 crc kubenswrapper[4898]: I1211 13:45:01.791851 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" event={"ID":"6d34ea67-eff5-4813-b997-35b5656058dd","Type":"ContainerDied","Data":"fd0c91d3f24fedb4feab2794bb77331a3937188e1596bc5b4c6ebda68444b0cd"} Dec 11 13:45:01 crc kubenswrapper[4898]: I1211 13:45:01.791874 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" event={"ID":"6d34ea67-eff5-4813-b997-35b5656058dd","Type":"ContainerStarted","Data":"73be8fb521d4d9e96210dcea944d3d8e658544df308858ed4f9790bfae54c655"} Dec 11 13:45:02 crc kubenswrapper[4898]: I1211 13:45:02.809958 4898 generic.go:334] "Generic (PLEG): container finished" podID="67ae3c0c-6ce1-429e-953d-2cac885ee43c" 
containerID="cc6eace62db90f89b417d4b3f590f650e920a06dc4d4cadd270672c5af4f70e5" exitCode=0 Dec 11 13:45:02 crc kubenswrapper[4898]: I1211 13:45:02.810042 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" event={"ID":"67ae3c0c-6ce1-429e-953d-2cac885ee43c","Type":"ContainerDied","Data":"cc6eace62db90f89b417d4b3f590f650e920a06dc4d4cadd270672c5af4f70e5"} Dec 11 13:45:03 crc kubenswrapper[4898]: I1211 13:45:03.285706 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" Dec 11 13:45:03 crc kubenswrapper[4898]: I1211 13:45:03.345629 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc5kx\" (UniqueName: \"kubernetes.io/projected/6d34ea67-eff5-4813-b997-35b5656058dd-kube-api-access-lc5kx\") pod \"6d34ea67-eff5-4813-b997-35b5656058dd\" (UID: \"6d34ea67-eff5-4813-b997-35b5656058dd\") " Dec 11 13:45:03 crc kubenswrapper[4898]: I1211 13:45:03.345818 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d34ea67-eff5-4813-b997-35b5656058dd-config-volume\") pod \"6d34ea67-eff5-4813-b997-35b5656058dd\" (UID: \"6d34ea67-eff5-4813-b997-35b5656058dd\") " Dec 11 13:45:03 crc kubenswrapper[4898]: I1211 13:45:03.345846 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d34ea67-eff5-4813-b997-35b5656058dd-secret-volume\") pod \"6d34ea67-eff5-4813-b997-35b5656058dd\" (UID: \"6d34ea67-eff5-4813-b997-35b5656058dd\") " Dec 11 13:45:03 crc kubenswrapper[4898]: I1211 13:45:03.346310 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d34ea67-eff5-4813-b997-35b5656058dd-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"6d34ea67-eff5-4813-b997-35b5656058dd" (UID: "6d34ea67-eff5-4813-b997-35b5656058dd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:45:03 crc kubenswrapper[4898]: I1211 13:45:03.355709 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d34ea67-eff5-4813-b997-35b5656058dd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6d34ea67-eff5-4813-b997-35b5656058dd" (UID: "6d34ea67-eff5-4813-b997-35b5656058dd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:45:03 crc kubenswrapper[4898]: I1211 13:45:03.355809 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d34ea67-eff5-4813-b997-35b5656058dd-kube-api-access-lc5kx" (OuterVolumeSpecName: "kube-api-access-lc5kx") pod "6d34ea67-eff5-4813-b997-35b5656058dd" (UID: "6d34ea67-eff5-4813-b997-35b5656058dd"). InnerVolumeSpecName "kube-api-access-lc5kx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:45:03 crc kubenswrapper[4898]: I1211 13:45:03.448076 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d34ea67-eff5-4813-b997-35b5656058dd-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 13:45:03 crc kubenswrapper[4898]: I1211 13:45:03.448113 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d34ea67-eff5-4813-b997-35b5656058dd-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 11 13:45:03 crc kubenswrapper[4898]: I1211 13:45:03.448126 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc5kx\" (UniqueName: \"kubernetes.io/projected/6d34ea67-eff5-4813-b997-35b5656058dd-kube-api-access-lc5kx\") on node \"crc\" DevicePath \"\"" Dec 11 13:45:03 crc kubenswrapper[4898]: I1211 13:45:03.822074 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" event={"ID":"6d34ea67-eff5-4813-b997-35b5656058dd","Type":"ContainerDied","Data":"73be8fb521d4d9e96210dcea944d3d8e658544df308858ed4f9790bfae54c655"} Dec 11 13:45:03 crc kubenswrapper[4898]: I1211 13:45:03.823608 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73be8fb521d4d9e96210dcea944d3d8e658544df308858ed4f9790bfae54c655" Dec 11 13:45:03 crc kubenswrapper[4898]: I1211 13:45:03.822087 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.275258 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.395998 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6"] Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.409351 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424300-txml6"] Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.470644 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-nova-metadata-neutron-config-0\") pod \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.470873 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-neutron-metadata-combined-ca-bundle\") pod \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.471101 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-ssh-key\") pod \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.471176 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\" (UID: 
\"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.471250 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx4xk\" (UniqueName: \"kubernetes.io/projected/67ae3c0c-6ce1-429e-953d-2cac885ee43c-kube-api-access-sx4xk\") pod \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.471325 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-inventory\") pod \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\" (UID: \"67ae3c0c-6ce1-429e-953d-2cac885ee43c\") " Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.476964 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67ae3c0c-6ce1-429e-953d-2cac885ee43c-kube-api-access-sx4xk" (OuterVolumeSpecName: "kube-api-access-sx4xk") pod "67ae3c0c-6ce1-429e-953d-2cac885ee43c" (UID: "67ae3c0c-6ce1-429e-953d-2cac885ee43c"). InnerVolumeSpecName "kube-api-access-sx4xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.477544 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "67ae3c0c-6ce1-429e-953d-2cac885ee43c" (UID: "67ae3c0c-6ce1-429e-953d-2cac885ee43c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.511584 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "67ae3c0c-6ce1-429e-953d-2cac885ee43c" (UID: "67ae3c0c-6ce1-429e-953d-2cac885ee43c"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.512321 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "67ae3c0c-6ce1-429e-953d-2cac885ee43c" (UID: "67ae3c0c-6ce1-429e-953d-2cac885ee43c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.517947 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-inventory" (OuterVolumeSpecName: "inventory") pod "67ae3c0c-6ce1-429e-953d-2cac885ee43c" (UID: "67ae3c0c-6ce1-429e-953d-2cac885ee43c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.521019 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "67ae3c0c-6ce1-429e-953d-2cac885ee43c" (UID: "67ae3c0c-6ce1-429e-953d-2cac885ee43c"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.574633 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.574671 4898 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.574689 4898 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.574717 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.574732 4898 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67ae3c0c-6ce1-429e-953d-2cac885ee43c-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.574747 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx4xk\" (UniqueName: \"kubernetes.io/projected/67ae3c0c-6ce1-429e-953d-2cac885ee43c-kube-api-access-sx4xk\") on node \"crc\" DevicePath \"\"" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.799107 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e650293a-d60c-4e05-88dd-ea1fa46b3492" 
path="/var/lib/kubelet/pods/e650293a-d60c-4e05-88dd-ea1fa46b3492/volumes" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.841185 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" event={"ID":"67ae3c0c-6ce1-429e-953d-2cac885ee43c","Type":"ContainerDied","Data":"8e8e7b2aa6615611e03d53701093ac03c4fcd616d784185a3e439683b06b7191"} Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.841241 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e8e7b2aa6615611e03d53701093ac03c4fcd616d784185a3e439683b06b7191" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.841258 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.914868 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd"] Dec 11 13:45:04 crc kubenswrapper[4898]: E1211 13:45:04.915671 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ae3c0c-6ce1-429e-953d-2cac885ee43c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.915704 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ae3c0c-6ce1-429e-953d-2cac885ee43c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 11 13:45:04 crc kubenswrapper[4898]: E1211 13:45:04.915738 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d34ea67-eff5-4813-b997-35b5656058dd" containerName="collect-profiles" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.915762 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d34ea67-eff5-4813-b997-35b5656058dd" containerName="collect-profiles" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.916241 4898 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6d34ea67-eff5-4813-b997-35b5656058dd" containerName="collect-profiles" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.916297 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ae3c0c-6ce1-429e-953d-2cac885ee43c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.917795 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.920912 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.921397 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.921626 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.922401 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.923034 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.933673 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd"] Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.994397 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mzktd\" (UID: 
\"500dff36-d95c-4690-8fad-db278c0c0ac9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.994514 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75vd8\" (UniqueName: \"kubernetes.io/projected/500dff36-d95c-4690-8fad-db278c0c0ac9-kube-api-access-75vd8\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mzktd\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.994570 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mzktd\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.994936 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mzktd\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:04 crc kubenswrapper[4898]: I1211 13:45:04.995170 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mzktd\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:05 crc kubenswrapper[4898]: I1211 13:45:05.097702 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mzktd\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:05 crc kubenswrapper[4898]: I1211 13:45:05.097899 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mzktd\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:05 crc kubenswrapper[4898]: I1211 13:45:05.097997 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mzktd\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:05 crc kubenswrapper[4898]: I1211 13:45:05.098046 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75vd8\" (UniqueName: \"kubernetes.io/projected/500dff36-d95c-4690-8fad-db278c0c0ac9-kube-api-access-75vd8\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mzktd\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:05 crc kubenswrapper[4898]: I1211 13:45:05.098090 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mzktd\" (UID: 
\"500dff36-d95c-4690-8fad-db278c0c0ac9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:05 crc kubenswrapper[4898]: I1211 13:45:05.101920 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mzktd\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:05 crc kubenswrapper[4898]: I1211 13:45:05.101941 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mzktd\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:05 crc kubenswrapper[4898]: I1211 13:45:05.102519 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mzktd\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:05 crc kubenswrapper[4898]: I1211 13:45:05.105995 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mzktd\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:05 crc kubenswrapper[4898]: I1211 13:45:05.117039 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75vd8\" (UniqueName: 
\"kubernetes.io/projected/500dff36-d95c-4690-8fad-db278c0c0ac9-kube-api-access-75vd8\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mzktd\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:05 crc kubenswrapper[4898]: I1211 13:45:05.308098 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:45:05 crc kubenswrapper[4898]: W1211 13:45:05.870184 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod500dff36_d95c_4690_8fad_db278c0c0ac9.slice/crio-6a0fb2407e3c28292da97f206521f23c7d4a0b16a79965d3d3f93257033bc2e3 WatchSource:0}: Error finding container 6a0fb2407e3c28292da97f206521f23c7d4a0b16a79965d3d3f93257033bc2e3: Status 404 returned error can't find the container with id 6a0fb2407e3c28292da97f206521f23c7d4a0b16a79965d3d3f93257033bc2e3 Dec 11 13:45:05 crc kubenswrapper[4898]: I1211 13:45:05.873179 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd"] Dec 11 13:45:06 crc kubenswrapper[4898]: I1211 13:45:06.864531 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" event={"ID":"500dff36-d95c-4690-8fad-db278c0c0ac9","Type":"ContainerStarted","Data":"ef6fd2eb7989cafe85a69c74720700943ccb51f524ee5a26cc8f40f6915c1036"} Dec 11 13:45:06 crc kubenswrapper[4898]: I1211 13:45:06.865175 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" event={"ID":"500dff36-d95c-4690-8fad-db278c0c0ac9","Type":"ContainerStarted","Data":"6a0fb2407e3c28292da97f206521f23c7d4a0b16a79965d3d3f93257033bc2e3"} Dec 11 13:45:06 crc kubenswrapper[4898]: I1211 13:45:06.899674 4898 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" podStartSLOduration=2.475630082 podStartE2EDuration="2.899652033s" podCreationTimestamp="2025-12-11 13:45:04 +0000 UTC" firstStartedPulling="2025-12-11 13:45:05.877514781 +0000 UTC m=+2463.449841218" lastFinishedPulling="2025-12-11 13:45:06.301536732 +0000 UTC m=+2463.873863169" observedRunningTime="2025-12-11 13:45:06.887080935 +0000 UTC m=+2464.459407412" watchObservedRunningTime="2025-12-11 13:45:06.899652033 +0000 UTC m=+2464.471978480" Dec 11 13:45:10 crc kubenswrapper[4898]: I1211 13:45:10.775202 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:45:10 crc kubenswrapper[4898]: E1211 13:45:10.776225 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:45:24 crc kubenswrapper[4898]: I1211 13:45:24.775617 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:45:24 crc kubenswrapper[4898]: E1211 13:45:24.776479 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:45:38 crc kubenswrapper[4898]: I1211 13:45:38.776214 4898 scope.go:117] "RemoveContainer" 
containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:45:38 crc kubenswrapper[4898]: E1211 13:45:38.777427 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:45:39 crc kubenswrapper[4898]: I1211 13:45:39.351508 4898 scope.go:117] "RemoveContainer" containerID="2336174d7dbc065d5466365b9f408e7c62157f8d84689adcf77e27663f53bd73" Dec 11 13:45:50 crc kubenswrapper[4898]: I1211 13:45:50.064195 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bmzx4"] Dec 11 13:45:50 crc kubenswrapper[4898]: I1211 13:45:50.068076 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:45:50 crc kubenswrapper[4898]: I1211 13:45:50.077130 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bmzx4"] Dec 11 13:45:50 crc kubenswrapper[4898]: I1211 13:45:50.225973 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzhr2\" (UniqueName: \"kubernetes.io/projected/0568c70c-dfab-4791-bd62-b11e04dc147c-kube-api-access-lzhr2\") pod \"community-operators-bmzx4\" (UID: \"0568c70c-dfab-4791-bd62-b11e04dc147c\") " pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:45:50 crc kubenswrapper[4898]: I1211 13:45:50.226491 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0568c70c-dfab-4791-bd62-b11e04dc147c-catalog-content\") pod \"community-operators-bmzx4\" (UID: \"0568c70c-dfab-4791-bd62-b11e04dc147c\") " pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:45:50 crc kubenswrapper[4898]: I1211 13:45:50.226692 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0568c70c-dfab-4791-bd62-b11e04dc147c-utilities\") pod \"community-operators-bmzx4\" (UID: \"0568c70c-dfab-4791-bd62-b11e04dc147c\") " pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:45:50 crc kubenswrapper[4898]: I1211 13:45:50.328770 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0568c70c-dfab-4791-bd62-b11e04dc147c-utilities\") pod \"community-operators-bmzx4\" (UID: \"0568c70c-dfab-4791-bd62-b11e04dc147c\") " pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:45:50 crc kubenswrapper[4898]: I1211 13:45:50.328944 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lzhr2\" (UniqueName: \"kubernetes.io/projected/0568c70c-dfab-4791-bd62-b11e04dc147c-kube-api-access-lzhr2\") pod \"community-operators-bmzx4\" (UID: \"0568c70c-dfab-4791-bd62-b11e04dc147c\") " pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:45:50 crc kubenswrapper[4898]: I1211 13:45:50.328982 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0568c70c-dfab-4791-bd62-b11e04dc147c-catalog-content\") pod \"community-operators-bmzx4\" (UID: \"0568c70c-dfab-4791-bd62-b11e04dc147c\") " pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:45:50 crc kubenswrapper[4898]: I1211 13:45:50.329575 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0568c70c-dfab-4791-bd62-b11e04dc147c-utilities\") pod \"community-operators-bmzx4\" (UID: \"0568c70c-dfab-4791-bd62-b11e04dc147c\") " pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:45:50 crc kubenswrapper[4898]: I1211 13:45:50.329581 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0568c70c-dfab-4791-bd62-b11e04dc147c-catalog-content\") pod \"community-operators-bmzx4\" (UID: \"0568c70c-dfab-4791-bd62-b11e04dc147c\") " pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:45:50 crc kubenswrapper[4898]: I1211 13:45:50.350968 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzhr2\" (UniqueName: \"kubernetes.io/projected/0568c70c-dfab-4791-bd62-b11e04dc147c-kube-api-access-lzhr2\") pod \"community-operators-bmzx4\" (UID: \"0568c70c-dfab-4791-bd62-b11e04dc147c\") " pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:45:50 crc kubenswrapper[4898]: I1211 13:45:50.456422 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:45:50 crc kubenswrapper[4898]: I1211 13:45:50.999325 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bmzx4"] Dec 11 13:45:51 crc kubenswrapper[4898]: I1211 13:45:51.394760 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bmzx4" event={"ID":"0568c70c-dfab-4791-bd62-b11e04dc147c","Type":"ContainerStarted","Data":"8e5cc45055fbfe6510f8947e88e6acfb005d703b4b1a592f2ca5cffab83d21f3"} Dec 11 13:45:51 crc kubenswrapper[4898]: I1211 13:45:51.395159 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bmzx4" event={"ID":"0568c70c-dfab-4791-bd62-b11e04dc147c","Type":"ContainerStarted","Data":"b031016b7e94eaa31ad4fa3506495e2da9fc1883cb353e33c8bdc985182b4e07"} Dec 11 13:45:52 crc kubenswrapper[4898]: I1211 13:45:52.421059 4898 generic.go:334] "Generic (PLEG): container finished" podID="0568c70c-dfab-4791-bd62-b11e04dc147c" containerID="8e5cc45055fbfe6510f8947e88e6acfb005d703b4b1a592f2ca5cffab83d21f3" exitCode=0 Dec 11 13:45:52 crc kubenswrapper[4898]: I1211 13:45:52.421274 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bmzx4" event={"ID":"0568c70c-dfab-4791-bd62-b11e04dc147c","Type":"ContainerDied","Data":"8e5cc45055fbfe6510f8947e88e6acfb005d703b4b1a592f2ca5cffab83d21f3"} Dec 11 13:45:52 crc kubenswrapper[4898]: I1211 13:45:52.781424 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:45:52 crc kubenswrapper[4898]: E1211 13:45:52.781760 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:45:54 crc kubenswrapper[4898]: I1211 13:45:54.446706 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bmzx4" event={"ID":"0568c70c-dfab-4791-bd62-b11e04dc147c","Type":"ContainerStarted","Data":"eb45be53de91b7a5e10abcb0ab0e966abee79dc671ba15873d5786f5e9806ebf"} Dec 11 13:45:55 crc kubenswrapper[4898]: I1211 13:45:55.464495 4898 generic.go:334] "Generic (PLEG): container finished" podID="0568c70c-dfab-4791-bd62-b11e04dc147c" containerID="eb45be53de91b7a5e10abcb0ab0e966abee79dc671ba15873d5786f5e9806ebf" exitCode=0 Dec 11 13:45:55 crc kubenswrapper[4898]: I1211 13:45:55.464573 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bmzx4" event={"ID":"0568c70c-dfab-4791-bd62-b11e04dc147c","Type":"ContainerDied","Data":"eb45be53de91b7a5e10abcb0ab0e966abee79dc671ba15873d5786f5e9806ebf"} Dec 11 13:45:56 crc kubenswrapper[4898]: I1211 13:45:56.495746 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bmzx4" event={"ID":"0568c70c-dfab-4791-bd62-b11e04dc147c","Type":"ContainerStarted","Data":"8a35ed31824daaaa494128fd5f2112ce1a6fc2d34cd3fdfd4be97786bf48e064"} Dec 11 13:45:56 crc kubenswrapper[4898]: I1211 13:45:56.516193 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bmzx4" podStartSLOduration=2.987131752 podStartE2EDuration="6.516172786s" podCreationTimestamp="2025-12-11 13:45:50 +0000 UTC" firstStartedPulling="2025-12-11 13:45:52.430217779 +0000 UTC m=+2510.002544216" lastFinishedPulling="2025-12-11 13:45:55.959258813 +0000 UTC m=+2513.531585250" observedRunningTime="2025-12-11 13:45:56.515725824 +0000 UTC m=+2514.088052301" 
watchObservedRunningTime="2025-12-11 13:45:56.516172786 +0000 UTC m=+2514.088499213" Dec 11 13:46:00 crc kubenswrapper[4898]: I1211 13:46:00.457245 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:46:00 crc kubenswrapper[4898]: I1211 13:46:00.457565 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:46:00 crc kubenswrapper[4898]: I1211 13:46:00.529262 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:46:06 crc kubenswrapper[4898]: I1211 13:46:06.776578 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:46:06 crc kubenswrapper[4898]: E1211 13:46:06.778096 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:46:10 crc kubenswrapper[4898]: I1211 13:46:10.506716 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:46:10 crc kubenswrapper[4898]: I1211 13:46:10.560472 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bmzx4"] Dec 11 13:46:10 crc kubenswrapper[4898]: I1211 13:46:10.671405 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bmzx4" podUID="0568c70c-dfab-4791-bd62-b11e04dc147c" containerName="registry-server" 
containerID="cri-o://8a35ed31824daaaa494128fd5f2112ce1a6fc2d34cd3fdfd4be97786bf48e064" gracePeriod=2 Dec 11 13:46:11 crc kubenswrapper[4898]: I1211 13:46:11.681820 4898 generic.go:334] "Generic (PLEG): container finished" podID="0568c70c-dfab-4791-bd62-b11e04dc147c" containerID="8a35ed31824daaaa494128fd5f2112ce1a6fc2d34cd3fdfd4be97786bf48e064" exitCode=0 Dec 11 13:46:11 crc kubenswrapper[4898]: I1211 13:46:11.681873 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bmzx4" event={"ID":"0568c70c-dfab-4791-bd62-b11e04dc147c","Type":"ContainerDied","Data":"8a35ed31824daaaa494128fd5f2112ce1a6fc2d34cd3fdfd4be97786bf48e064"} Dec 11 13:46:11 crc kubenswrapper[4898]: I1211 13:46:11.682471 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bmzx4" event={"ID":"0568c70c-dfab-4791-bd62-b11e04dc147c","Type":"ContainerDied","Data":"b031016b7e94eaa31ad4fa3506495e2da9fc1883cb353e33c8bdc985182b4e07"} Dec 11 13:46:11 crc kubenswrapper[4898]: I1211 13:46:11.682494 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b031016b7e94eaa31ad4fa3506495e2da9fc1883cb353e33c8bdc985182b4e07" Dec 11 13:46:11 crc kubenswrapper[4898]: I1211 13:46:11.769302 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:46:11 crc kubenswrapper[4898]: I1211 13:46:11.959001 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzhr2\" (UniqueName: \"kubernetes.io/projected/0568c70c-dfab-4791-bd62-b11e04dc147c-kube-api-access-lzhr2\") pod \"0568c70c-dfab-4791-bd62-b11e04dc147c\" (UID: \"0568c70c-dfab-4791-bd62-b11e04dc147c\") " Dec 11 13:46:11 crc kubenswrapper[4898]: I1211 13:46:11.959104 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0568c70c-dfab-4791-bd62-b11e04dc147c-catalog-content\") pod \"0568c70c-dfab-4791-bd62-b11e04dc147c\" (UID: \"0568c70c-dfab-4791-bd62-b11e04dc147c\") " Dec 11 13:46:11 crc kubenswrapper[4898]: I1211 13:46:11.959288 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0568c70c-dfab-4791-bd62-b11e04dc147c-utilities\") pod \"0568c70c-dfab-4791-bd62-b11e04dc147c\" (UID: \"0568c70c-dfab-4791-bd62-b11e04dc147c\") " Dec 11 13:46:11 crc kubenswrapper[4898]: I1211 13:46:11.960342 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0568c70c-dfab-4791-bd62-b11e04dc147c-utilities" (OuterVolumeSpecName: "utilities") pod "0568c70c-dfab-4791-bd62-b11e04dc147c" (UID: "0568c70c-dfab-4791-bd62-b11e04dc147c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:46:11 crc kubenswrapper[4898]: I1211 13:46:11.966745 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0568c70c-dfab-4791-bd62-b11e04dc147c-kube-api-access-lzhr2" (OuterVolumeSpecName: "kube-api-access-lzhr2") pod "0568c70c-dfab-4791-bd62-b11e04dc147c" (UID: "0568c70c-dfab-4791-bd62-b11e04dc147c"). InnerVolumeSpecName "kube-api-access-lzhr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:46:12 crc kubenswrapper[4898]: I1211 13:46:12.015610 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0568c70c-dfab-4791-bd62-b11e04dc147c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0568c70c-dfab-4791-bd62-b11e04dc147c" (UID: "0568c70c-dfab-4791-bd62-b11e04dc147c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:46:12 crc kubenswrapper[4898]: I1211 13:46:12.062389 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzhr2\" (UniqueName: \"kubernetes.io/projected/0568c70c-dfab-4791-bd62-b11e04dc147c-kube-api-access-lzhr2\") on node \"crc\" DevicePath \"\"" Dec 11 13:46:12 crc kubenswrapper[4898]: I1211 13:46:12.062476 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0568c70c-dfab-4791-bd62-b11e04dc147c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:46:12 crc kubenswrapper[4898]: I1211 13:46:12.062491 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0568c70c-dfab-4791-bd62-b11e04dc147c-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:46:12 crc kubenswrapper[4898]: I1211 13:46:12.697037 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bmzx4" Dec 11 13:46:12 crc kubenswrapper[4898]: I1211 13:46:12.747222 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bmzx4"] Dec 11 13:46:12 crc kubenswrapper[4898]: I1211 13:46:12.761414 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bmzx4"] Dec 11 13:46:12 crc kubenswrapper[4898]: I1211 13:46:12.788540 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0568c70c-dfab-4791-bd62-b11e04dc147c" path="/var/lib/kubelet/pods/0568c70c-dfab-4791-bd62-b11e04dc147c/volumes" Dec 11 13:46:19 crc kubenswrapper[4898]: I1211 13:46:19.775962 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:46:19 crc kubenswrapper[4898]: E1211 13:46:19.778256 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:46:31 crc kubenswrapper[4898]: I1211 13:46:31.775297 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:46:31 crc kubenswrapper[4898]: E1211 13:46:31.776253 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" 
podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:46:46 crc kubenswrapper[4898]: I1211 13:46:46.775331 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:46:46 crc kubenswrapper[4898]: E1211 13:46:46.777500 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:46:58 crc kubenswrapper[4898]: I1211 13:46:58.776024 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:46:58 crc kubenswrapper[4898]: E1211 13:46:58.777037 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:47:10 crc kubenswrapper[4898]: I1211 13:47:10.776293 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:47:10 crc kubenswrapper[4898]: E1211 13:47:10.777565 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:47:23 crc kubenswrapper[4898]: I1211 13:47:23.775607 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:47:23 crc kubenswrapper[4898]: E1211 13:47:23.776509 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:47:36 crc kubenswrapper[4898]: I1211 13:47:36.775551 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:47:36 crc kubenswrapper[4898]: E1211 13:47:36.776747 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:47:47 crc kubenswrapper[4898]: I1211 13:47:47.775081 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:47:47 crc kubenswrapper[4898]: E1211 13:47:47.776041 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:48:00 crc kubenswrapper[4898]: I1211 13:48:00.775539 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:48:00 crc kubenswrapper[4898]: E1211 13:48:00.776324 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:48:14 crc kubenswrapper[4898]: I1211 13:48:14.775955 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:48:14 crc kubenswrapper[4898]: E1211 13:48:14.776767 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:48:29 crc kubenswrapper[4898]: I1211 13:48:29.775636 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:48:29 crc kubenswrapper[4898]: E1211 13:48:29.776393 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:48:40 crc kubenswrapper[4898]: I1211 13:48:40.775764 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:48:40 crc kubenswrapper[4898]: E1211 13:48:40.777399 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:48:52 crc kubenswrapper[4898]: I1211 13:48:52.782967 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:48:52 crc kubenswrapper[4898]: E1211 13:48:52.783921 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:49:03 crc kubenswrapper[4898]: I1211 13:49:03.775855 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:49:03 crc kubenswrapper[4898]: E1211 13:49:03.777626 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:49:15 crc kubenswrapper[4898]: I1211 13:49:15.775524 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:49:16 crc kubenswrapper[4898]: I1211 13:49:16.076570 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"af271096b767ce5903084b05e8ad0ee53c330083b7f59f687de7d1f33e49b2a5"} Dec 11 13:49:28 crc kubenswrapper[4898]: I1211 13:49:28.473895 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pc8dn"] Dec 11 13:49:28 crc kubenswrapper[4898]: E1211 13:49:28.475270 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0568c70c-dfab-4791-bd62-b11e04dc147c" containerName="extract-content" Dec 11 13:49:28 crc kubenswrapper[4898]: I1211 13:49:28.475288 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0568c70c-dfab-4791-bd62-b11e04dc147c" containerName="extract-content" Dec 11 13:49:28 crc kubenswrapper[4898]: E1211 13:49:28.475304 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0568c70c-dfab-4791-bd62-b11e04dc147c" containerName="registry-server" Dec 11 13:49:28 crc kubenswrapper[4898]: I1211 13:49:28.475312 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0568c70c-dfab-4791-bd62-b11e04dc147c" containerName="registry-server" Dec 11 13:49:28 crc kubenswrapper[4898]: E1211 13:49:28.475371 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0568c70c-dfab-4791-bd62-b11e04dc147c" containerName="extract-utilities" Dec 11 13:49:28 crc kubenswrapper[4898]: 
I1211 13:49:28.475382 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0568c70c-dfab-4791-bd62-b11e04dc147c" containerName="extract-utilities" Dec 11 13:49:28 crc kubenswrapper[4898]: I1211 13:49:28.475678 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0568c70c-dfab-4791-bd62-b11e04dc147c" containerName="registry-server" Dec 11 13:49:28 crc kubenswrapper[4898]: I1211 13:49:28.478728 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:28 crc kubenswrapper[4898]: I1211 13:49:28.491642 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pc8dn"] Dec 11 13:49:28 crc kubenswrapper[4898]: I1211 13:49:28.580116 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-utilities\") pod \"certified-operators-pc8dn\" (UID: \"b6a6bb00-888f-483a-b9f3-440c5fa5f62e\") " pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:28 crc kubenswrapper[4898]: I1211 13:49:28.580218 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg77n\" (UniqueName: \"kubernetes.io/projected/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-kube-api-access-kg77n\") pod \"certified-operators-pc8dn\" (UID: \"b6a6bb00-888f-483a-b9f3-440c5fa5f62e\") " pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:28 crc kubenswrapper[4898]: I1211 13:49:28.580583 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-catalog-content\") pod \"certified-operators-pc8dn\" (UID: \"b6a6bb00-888f-483a-b9f3-440c5fa5f62e\") " pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:28 crc 
kubenswrapper[4898]: I1211 13:49:28.683284 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-catalog-content\") pod \"certified-operators-pc8dn\" (UID: \"b6a6bb00-888f-483a-b9f3-440c5fa5f62e\") " pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:28 crc kubenswrapper[4898]: I1211 13:49:28.683816 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-catalog-content\") pod \"certified-operators-pc8dn\" (UID: \"b6a6bb00-888f-483a-b9f3-440c5fa5f62e\") " pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:28 crc kubenswrapper[4898]: I1211 13:49:28.683959 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-utilities\") pod \"certified-operators-pc8dn\" (UID: \"b6a6bb00-888f-483a-b9f3-440c5fa5f62e\") " pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:28 crc kubenswrapper[4898]: I1211 13:49:28.684060 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg77n\" (UniqueName: \"kubernetes.io/projected/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-kube-api-access-kg77n\") pod \"certified-operators-pc8dn\" (UID: \"b6a6bb00-888f-483a-b9f3-440c5fa5f62e\") " pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:28 crc kubenswrapper[4898]: I1211 13:49:28.684218 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-utilities\") pod \"certified-operators-pc8dn\" (UID: \"b6a6bb00-888f-483a-b9f3-440c5fa5f62e\") " pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:28 crc kubenswrapper[4898]: I1211 13:49:28.704708 
4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg77n\" (UniqueName: \"kubernetes.io/projected/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-kube-api-access-kg77n\") pod \"certified-operators-pc8dn\" (UID: \"b6a6bb00-888f-483a-b9f3-440c5fa5f62e\") " pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:28 crc kubenswrapper[4898]: I1211 13:49:28.802141 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:29 crc kubenswrapper[4898]: I1211 13:49:29.358016 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pc8dn"] Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 13:49:30.223643 4898 generic.go:334] "Generic (PLEG): container finished" podID="b6a6bb00-888f-483a-b9f3-440c5fa5f62e" containerID="df6be2bd3001b92632f91facff448c6dfec2d0d6da8aca6b8294bd7d6865e108" exitCode=0 Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 13:49:30.223739 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc8dn" event={"ID":"b6a6bb00-888f-483a-b9f3-440c5fa5f62e","Type":"ContainerDied","Data":"df6be2bd3001b92632f91facff448c6dfec2d0d6da8aca6b8294bd7d6865e108"} Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 13:49:30.223948 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc8dn" event={"ID":"b6a6bb00-888f-483a-b9f3-440c5fa5f62e","Type":"ContainerStarted","Data":"5d9151c7f40d1774b39d7e8390a06e1f3733c2ec06291d428c223e4a663631b8"} Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 13:49:30.225980 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 13:49:30.666084 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xg76c"] Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 
13:49:30.669188 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 13:49:30.679371 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xg76c"] Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 13:49:30.844524 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-utilities\") pod \"redhat-marketplace-xg76c\" (UID: \"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92\") " pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 13:49:30.844597 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb5nx\" (UniqueName: \"kubernetes.io/projected/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-kube-api-access-rb5nx\") pod \"redhat-marketplace-xg76c\" (UID: \"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92\") " pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 13:49:30.844647 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-catalog-content\") pod \"redhat-marketplace-xg76c\" (UID: \"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92\") " pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 13:49:30.946596 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-utilities\") pod \"redhat-marketplace-xg76c\" (UID: \"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92\") " pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 
13:49:30.946707 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb5nx\" (UniqueName: \"kubernetes.io/projected/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-kube-api-access-rb5nx\") pod \"redhat-marketplace-xg76c\" (UID: \"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92\") " pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 13:49:30.946744 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-catalog-content\") pod \"redhat-marketplace-xg76c\" (UID: \"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92\") " pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 13:49:30.947339 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-catalog-content\") pod \"redhat-marketplace-xg76c\" (UID: \"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92\") " pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 13:49:30.947611 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-utilities\") pod \"redhat-marketplace-xg76c\" (UID: \"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92\") " pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:30 crc kubenswrapper[4898]: I1211 13:49:30.966578 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb5nx\" (UniqueName: \"kubernetes.io/projected/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-kube-api-access-rb5nx\") pod \"redhat-marketplace-xg76c\" (UID: \"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92\") " pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:31 crc kubenswrapper[4898]: I1211 13:49:31.001877 4898 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:31 crc kubenswrapper[4898]: I1211 13:49:31.489535 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xg76c"] Dec 11 13:49:32 crc kubenswrapper[4898]: I1211 13:49:32.244860 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xg76c" event={"ID":"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92","Type":"ContainerStarted","Data":"3d271a5b0a3e40862194b5ae66662c4777c133cdea00808e594a3133ab842a7b"} Dec 11 13:49:33 crc kubenswrapper[4898]: I1211 13:49:33.260269 4898 generic.go:334] "Generic (PLEG): container finished" podID="500dff36-d95c-4690-8fad-db278c0c0ac9" containerID="ef6fd2eb7989cafe85a69c74720700943ccb51f524ee5a26cc8f40f6915c1036" exitCode=0 Dec 11 13:49:33 crc kubenswrapper[4898]: I1211 13:49:33.260430 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" event={"ID":"500dff36-d95c-4690-8fad-db278c0c0ac9","Type":"ContainerDied","Data":"ef6fd2eb7989cafe85a69c74720700943ccb51f524ee5a26cc8f40f6915c1036"} Dec 11 13:49:34 crc kubenswrapper[4898]: I1211 13:49:34.276173 4898 generic.go:334] "Generic (PLEG): container finished" podID="89dd4b4d-f3fa-4232-95bb-2c1439cf9d92" containerID="da2f49098accca135033f3eed6b69b010bb2995a242bd7730bf82403e1cca0ba" exitCode=0 Dec 11 13:49:34 crc kubenswrapper[4898]: I1211 13:49:34.276239 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xg76c" event={"ID":"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92","Type":"ContainerDied","Data":"da2f49098accca135033f3eed6b69b010bb2995a242bd7730bf82403e1cca0ba"} Dec 11 13:49:34 crc kubenswrapper[4898]: I1211 13:49:34.887682 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.046512 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75vd8\" (UniqueName: \"kubernetes.io/projected/500dff36-d95c-4690-8fad-db278c0c0ac9-kube-api-access-75vd8\") pod \"500dff36-d95c-4690-8fad-db278c0c0ac9\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.046751 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-ssh-key\") pod \"500dff36-d95c-4690-8fad-db278c0c0ac9\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.046818 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-libvirt-combined-ca-bundle\") pod \"500dff36-d95c-4690-8fad-db278c0c0ac9\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.046906 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-inventory\") pod \"500dff36-d95c-4690-8fad-db278c0c0ac9\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.046939 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-libvirt-secret-0\") pod \"500dff36-d95c-4690-8fad-db278c0c0ac9\" (UID: \"500dff36-d95c-4690-8fad-db278c0c0ac9\") " Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.063785 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "500dff36-d95c-4690-8fad-db278c0c0ac9" (UID: "500dff36-d95c-4690-8fad-db278c0c0ac9"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.064933 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500dff36-d95c-4690-8fad-db278c0c0ac9-kube-api-access-75vd8" (OuterVolumeSpecName: "kube-api-access-75vd8") pod "500dff36-d95c-4690-8fad-db278c0c0ac9" (UID: "500dff36-d95c-4690-8fad-db278c0c0ac9"). InnerVolumeSpecName "kube-api-access-75vd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.089310 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "500dff36-d95c-4690-8fad-db278c0c0ac9" (UID: "500dff36-d95c-4690-8fad-db278c0c0ac9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.093638 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-inventory" (OuterVolumeSpecName: "inventory") pod "500dff36-d95c-4690-8fad-db278c0c0ac9" (UID: "500dff36-d95c-4690-8fad-db278c0c0ac9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.095971 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "500dff36-d95c-4690-8fad-db278c0c0ac9" (UID: "500dff36-d95c-4690-8fad-db278c0c0ac9"). 
InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.149914 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.149958 4898 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.149973 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.149986 4898 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/500dff36-d95c-4690-8fad-db278c0c0ac9-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.149999 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75vd8\" (UniqueName: \"kubernetes.io/projected/500dff36-d95c-4690-8fad-db278c0c0ac9-kube-api-access-75vd8\") on node \"crc\" DevicePath \"\"" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.291946 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" event={"ID":"500dff36-d95c-4690-8fad-db278c0c0ac9","Type":"ContainerDied","Data":"6a0fb2407e3c28292da97f206521f23c7d4a0b16a79965d3d3f93257033bc2e3"} Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.291984 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a0fb2407e3c28292da97f206521f23c7d4a0b16a79965d3d3f93257033bc2e3" Dec 11 
13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.292033 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mzktd" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.401265 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk"] Dec 11 13:49:35 crc kubenswrapper[4898]: E1211 13:49:35.401898 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="500dff36-d95c-4690-8fad-db278c0c0ac9" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.401923 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="500dff36-d95c-4690-8fad-db278c0c0ac9" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.402232 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="500dff36-d95c-4690-8fad-db278c0c0ac9" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.403244 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.407778 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.408064 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.408774 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.409220 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.411717 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.415185 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.424867 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.435156 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk"] Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.456741 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: 
I1211 13:49:35.456805 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.456856 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.456873 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvk9s\" (UniqueName: \"kubernetes.io/projected/57ebbb8f-174c-4862-bba6-5644c98c7b1c-kube-api-access-xvk9s\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.457084 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.457334 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.457569 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.457610 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.457692 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.558728 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.558777 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvk9s\" (UniqueName: \"kubernetes.io/projected/57ebbb8f-174c-4862-bba6-5644c98c7b1c-kube-api-access-xvk9s\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.558851 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.558938 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.559027 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.559057 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.559095 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.559136 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.559187 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.559889 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 
crc kubenswrapper[4898]: I1211 13:49:35.563695 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.563799 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.564209 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.564990 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.565352 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: 
\"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.566058 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.566273 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.581618 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvk9s\" (UniqueName: \"kubernetes.io/projected/57ebbb8f-174c-4862-bba6-5644c98c7b1c-kube-api-access-xvk9s\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jrmnk\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:35 crc kubenswrapper[4898]: I1211 13:49:35.744487 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:49:37 crc kubenswrapper[4898]: W1211 13:49:37.696479 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57ebbb8f_174c_4862_bba6_5644c98c7b1c.slice/crio-f580840c1ac311c65d635856e19d2ae8db9435d0bcf31c74be369316fbf1c58b WatchSource:0}: Error finding container f580840c1ac311c65d635856e19d2ae8db9435d0bcf31c74be369316fbf1c58b: Status 404 returned error can't find the container with id f580840c1ac311c65d635856e19d2ae8db9435d0bcf31c74be369316fbf1c58b Dec 11 13:49:37 crc kubenswrapper[4898]: I1211 13:49:37.698570 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk"] Dec 11 13:49:38 crc kubenswrapper[4898]: I1211 13:49:38.332689 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" event={"ID":"57ebbb8f-174c-4862-bba6-5644c98c7b1c","Type":"ContainerStarted","Data":"f580840c1ac311c65d635856e19d2ae8db9435d0bcf31c74be369316fbf1c58b"} Dec 11 13:49:38 crc kubenswrapper[4898]: I1211 13:49:38.339656 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc8dn" event={"ID":"b6a6bb00-888f-483a-b9f3-440c5fa5f62e","Type":"ContainerStarted","Data":"20aea438ccb0f34218fa4d3df0214020bb664ddb8e82f6e1052bbce6bb2e360e"} Dec 11 13:49:38 crc kubenswrapper[4898]: I1211 13:49:38.343814 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xg76c" event={"ID":"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92","Type":"ContainerStarted","Data":"caeac67070dba1e9cc06ae4cf1271ec48d5ea43b4fe5b24db7397e51e53e3e00"} Dec 11 13:49:39 crc kubenswrapper[4898]: I1211 13:49:39.355371 4898 generic.go:334] "Generic (PLEG): container finished" podID="89dd4b4d-f3fa-4232-95bb-2c1439cf9d92" 
containerID="caeac67070dba1e9cc06ae4cf1271ec48d5ea43b4fe5b24db7397e51e53e3e00" exitCode=0 Dec 11 13:49:39 crc kubenswrapper[4898]: I1211 13:49:39.355473 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xg76c" event={"ID":"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92","Type":"ContainerDied","Data":"caeac67070dba1e9cc06ae4cf1271ec48d5ea43b4fe5b24db7397e51e53e3e00"} Dec 11 13:49:41 crc kubenswrapper[4898]: I1211 13:49:41.389607 4898 generic.go:334] "Generic (PLEG): container finished" podID="b6a6bb00-888f-483a-b9f3-440c5fa5f62e" containerID="20aea438ccb0f34218fa4d3df0214020bb664ddb8e82f6e1052bbce6bb2e360e" exitCode=0 Dec 11 13:49:41 crc kubenswrapper[4898]: I1211 13:49:41.389695 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc8dn" event={"ID":"b6a6bb00-888f-483a-b9f3-440c5fa5f62e","Type":"ContainerDied","Data":"20aea438ccb0f34218fa4d3df0214020bb664ddb8e82f6e1052bbce6bb2e360e"} Dec 11 13:49:43 crc kubenswrapper[4898]: I1211 13:49:43.416790 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" event={"ID":"57ebbb8f-174c-4862-bba6-5644c98c7b1c","Type":"ContainerStarted","Data":"8004f1601799949605ee1cca104be273039dce6d3f4d1ca0f49f5829c374c5d2"} Dec 11 13:49:43 crc kubenswrapper[4898]: I1211 13:49:43.440643 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xg76c" event={"ID":"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92","Type":"ContainerStarted","Data":"d44857e09848f6856c31986b41f33b78d75b2ac57133d1306ff33bc0a3c5c40e"} Dec 11 13:49:43 crc kubenswrapper[4898]: I1211 13:49:43.469415 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xg76c" podStartSLOduration=5.745749463 podStartE2EDuration="13.469395995s" podCreationTimestamp="2025-12-11 13:49:30 +0000 UTC" firstStartedPulling="2025-12-11 
13:49:35.081884905 +0000 UTC m=+2732.654211342" lastFinishedPulling="2025-12-11 13:49:42.805531437 +0000 UTC m=+2740.377857874" observedRunningTime="2025-12-11 13:49:43.460224728 +0000 UTC m=+2741.032551165" watchObservedRunningTime="2025-12-11 13:49:43.469395995 +0000 UTC m=+2741.041722432" Dec 11 13:49:43 crc kubenswrapper[4898]: I1211 13:49:43.471207 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" podStartSLOduration=3.390699196 podStartE2EDuration="8.471201503s" podCreationTimestamp="2025-12-11 13:49:35 +0000 UTC" firstStartedPulling="2025-12-11 13:49:37.699534904 +0000 UTC m=+2735.271861341" lastFinishedPulling="2025-12-11 13:49:42.780037201 +0000 UTC m=+2740.352363648" observedRunningTime="2025-12-11 13:49:43.437038335 +0000 UTC m=+2741.009364772" watchObservedRunningTime="2025-12-11 13:49:43.471201503 +0000 UTC m=+2741.043527940" Dec 11 13:49:44 crc kubenswrapper[4898]: I1211 13:49:44.453410 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc8dn" event={"ID":"b6a6bb00-888f-483a-b9f3-440c5fa5f62e","Type":"ContainerStarted","Data":"30ca9d6964c13957c220c44fa454beb68b1ab5f88acde343e85db01b8cd5c77f"} Dec 11 13:49:44 crc kubenswrapper[4898]: I1211 13:49:44.478942 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pc8dn" podStartSLOduration=3.321936573 podStartE2EDuration="16.478924238s" podCreationTimestamp="2025-12-11 13:49:28 +0000 UTC" firstStartedPulling="2025-12-11 13:49:30.225556895 +0000 UTC m=+2727.797883332" lastFinishedPulling="2025-12-11 13:49:43.38254456 +0000 UTC m=+2740.954870997" observedRunningTime="2025-12-11 13:49:44.472987998 +0000 UTC m=+2742.045314435" watchObservedRunningTime="2025-12-11 13:49:44.478924238 +0000 UTC m=+2742.051250675" Dec 11 13:49:48 crc kubenswrapper[4898]: I1211 13:49:48.803036 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:48 crc kubenswrapper[4898]: I1211 13:49:48.804605 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:48 crc kubenswrapper[4898]: I1211 13:49:48.852272 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:49 crc kubenswrapper[4898]: I1211 13:49:49.548769 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:49 crc kubenswrapper[4898]: I1211 13:49:49.614060 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pc8dn"] Dec 11 13:49:51 crc kubenswrapper[4898]: I1211 13:49:51.002674 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:51 crc kubenswrapper[4898]: I1211 13:49:51.002783 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:51 crc kubenswrapper[4898]: I1211 13:49:51.064366 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:51 crc kubenswrapper[4898]: I1211 13:49:51.523919 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pc8dn" podUID="b6a6bb00-888f-483a-b9f3-440c5fa5f62e" containerName="registry-server" containerID="cri-o://30ca9d6964c13957c220c44fa454beb68b1ab5f88acde343e85db01b8cd5c77f" gracePeriod=2 Dec 11 13:49:51 crc kubenswrapper[4898]: I1211 13:49:51.580896 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:52 crc kubenswrapper[4898]: 
I1211 13:49:52.489524 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xg76c"] Dec 11 13:49:52 crc kubenswrapper[4898]: I1211 13:49:52.535602 4898 generic.go:334] "Generic (PLEG): container finished" podID="b6a6bb00-888f-483a-b9f3-440c5fa5f62e" containerID="30ca9d6964c13957c220c44fa454beb68b1ab5f88acde343e85db01b8cd5c77f" exitCode=0 Dec 11 13:49:52 crc kubenswrapper[4898]: I1211 13:49:52.535685 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc8dn" event={"ID":"b6a6bb00-888f-483a-b9f3-440c5fa5f62e","Type":"ContainerDied","Data":"30ca9d6964c13957c220c44fa454beb68b1ab5f88acde343e85db01b8cd5c77f"} Dec 11 13:49:52 crc kubenswrapper[4898]: I1211 13:49:52.535731 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc8dn" event={"ID":"b6a6bb00-888f-483a-b9f3-440c5fa5f62e","Type":"ContainerDied","Data":"5d9151c7f40d1774b39d7e8390a06e1f3733c2ec06291d428c223e4a663631b8"} Dec 11 13:49:52 crc kubenswrapper[4898]: I1211 13:49:52.535743 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d9151c7f40d1774b39d7e8390a06e1f3733c2ec06291d428c223e4a663631b8" Dec 11 13:49:52 crc kubenswrapper[4898]: I1211 13:49:52.622983 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:52 crc kubenswrapper[4898]: I1211 13:49:52.820377 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg77n\" (UniqueName: \"kubernetes.io/projected/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-kube-api-access-kg77n\") pod \"b6a6bb00-888f-483a-b9f3-440c5fa5f62e\" (UID: \"b6a6bb00-888f-483a-b9f3-440c5fa5f62e\") " Dec 11 13:49:52 crc kubenswrapper[4898]: I1211 13:49:52.820447 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-catalog-content\") pod \"b6a6bb00-888f-483a-b9f3-440c5fa5f62e\" (UID: \"b6a6bb00-888f-483a-b9f3-440c5fa5f62e\") " Dec 11 13:49:52 crc kubenswrapper[4898]: I1211 13:49:52.820680 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-utilities\") pod \"b6a6bb00-888f-483a-b9f3-440c5fa5f62e\" (UID: \"b6a6bb00-888f-483a-b9f3-440c5fa5f62e\") " Dec 11 13:49:52 crc kubenswrapper[4898]: I1211 13:49:52.821591 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-utilities" (OuterVolumeSpecName: "utilities") pod "b6a6bb00-888f-483a-b9f3-440c5fa5f62e" (UID: "b6a6bb00-888f-483a-b9f3-440c5fa5f62e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:49:52 crc kubenswrapper[4898]: I1211 13:49:52.826029 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-kube-api-access-kg77n" (OuterVolumeSpecName: "kube-api-access-kg77n") pod "b6a6bb00-888f-483a-b9f3-440c5fa5f62e" (UID: "b6a6bb00-888f-483a-b9f3-440c5fa5f62e"). InnerVolumeSpecName "kube-api-access-kg77n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:49:52 crc kubenswrapper[4898]: I1211 13:49:52.876190 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6a6bb00-888f-483a-b9f3-440c5fa5f62e" (UID: "b6a6bb00-888f-483a-b9f3-440c5fa5f62e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:49:52 crc kubenswrapper[4898]: I1211 13:49:52.923847 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg77n\" (UniqueName: \"kubernetes.io/projected/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-kube-api-access-kg77n\") on node \"crc\" DevicePath \"\"" Dec 11 13:49:52 crc kubenswrapper[4898]: I1211 13:49:52.923891 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:49:52 crc kubenswrapper[4898]: I1211 13:49:52.923902 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a6bb00-888f-483a-b9f3-440c5fa5f62e-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:49:53 crc kubenswrapper[4898]: I1211 13:49:53.547307 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pc8dn" Dec 11 13:49:53 crc kubenswrapper[4898]: I1211 13:49:53.547525 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xg76c" podUID="89dd4b4d-f3fa-4232-95bb-2c1439cf9d92" containerName="registry-server" containerID="cri-o://d44857e09848f6856c31986b41f33b78d75b2ac57133d1306ff33bc0a3c5c40e" gracePeriod=2 Dec 11 13:49:53 crc kubenswrapper[4898]: I1211 13:49:53.592998 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pc8dn"] Dec 11 13:49:53 crc kubenswrapper[4898]: I1211 13:49:53.602674 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pc8dn"] Dec 11 13:49:54 crc kubenswrapper[4898]: I1211 13:49:54.559495 4898 generic.go:334] "Generic (PLEG): container finished" podID="89dd4b4d-f3fa-4232-95bb-2c1439cf9d92" containerID="d44857e09848f6856c31986b41f33b78d75b2ac57133d1306ff33bc0a3c5c40e" exitCode=0 Dec 11 13:49:54 crc kubenswrapper[4898]: I1211 13:49:54.559559 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xg76c" event={"ID":"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92","Type":"ContainerDied","Data":"d44857e09848f6856c31986b41f33b78d75b2ac57133d1306ff33bc0a3c5c40e"} Dec 11 13:49:54 crc kubenswrapper[4898]: I1211 13:49:54.790756 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6a6bb00-888f-483a-b9f3-440c5fa5f62e" path="/var/lib/kubelet/pods/b6a6bb00-888f-483a-b9f3-440c5fa5f62e/volumes" Dec 11 13:49:55 crc kubenswrapper[4898]: I1211 13:49:55.570569 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:55 crc kubenswrapper[4898]: I1211 13:49:55.571689 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xg76c" event={"ID":"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92","Type":"ContainerDied","Data":"3d271a5b0a3e40862194b5ae66662c4777c133cdea00808e594a3133ab842a7b"} Dec 11 13:49:55 crc kubenswrapper[4898]: I1211 13:49:55.571762 4898 scope.go:117] "RemoveContainer" containerID="d44857e09848f6856c31986b41f33b78d75b2ac57133d1306ff33bc0a3c5c40e" Dec 11 13:49:55 crc kubenswrapper[4898]: I1211 13:49:55.589242 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-utilities\") pod \"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92\" (UID: \"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92\") " Dec 11 13:49:55 crc kubenswrapper[4898]: I1211 13:49:55.589700 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb5nx\" (UniqueName: \"kubernetes.io/projected/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-kube-api-access-rb5nx\") pod \"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92\" (UID: \"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92\") " Dec 11 13:49:55 crc kubenswrapper[4898]: I1211 13:49:55.590106 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-utilities" (OuterVolumeSpecName: "utilities") pod "89dd4b4d-f3fa-4232-95bb-2c1439cf9d92" (UID: "89dd4b4d-f3fa-4232-95bb-2c1439cf9d92"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:49:55 crc kubenswrapper[4898]: I1211 13:49:55.591049 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:49:55 crc kubenswrapper[4898]: I1211 13:49:55.603954 4898 scope.go:117] "RemoveContainer" containerID="caeac67070dba1e9cc06ae4cf1271ec48d5ea43b4fe5b24db7397e51e53e3e00" Dec 11 13:49:55 crc kubenswrapper[4898]: I1211 13:49:55.604424 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-kube-api-access-rb5nx" (OuterVolumeSpecName: "kube-api-access-rb5nx") pod "89dd4b4d-f3fa-4232-95bb-2c1439cf9d92" (UID: "89dd4b4d-f3fa-4232-95bb-2c1439cf9d92"). InnerVolumeSpecName "kube-api-access-rb5nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:49:55 crc kubenswrapper[4898]: I1211 13:49:55.676027 4898 scope.go:117] "RemoveContainer" containerID="da2f49098accca135033f3eed6b69b010bb2995a242bd7730bf82403e1cca0ba" Dec 11 13:49:55 crc kubenswrapper[4898]: I1211 13:49:55.692917 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-catalog-content\") pod \"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92\" (UID: \"89dd4b4d-f3fa-4232-95bb-2c1439cf9d92\") " Dec 11 13:49:55 crc kubenswrapper[4898]: I1211 13:49:55.693759 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb5nx\" (UniqueName: \"kubernetes.io/projected/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-kube-api-access-rb5nx\") on node \"crc\" DevicePath \"\"" Dec 11 13:49:55 crc kubenswrapper[4898]: I1211 13:49:55.719318 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "89dd4b4d-f3fa-4232-95bb-2c1439cf9d92" (UID: "89dd4b4d-f3fa-4232-95bb-2c1439cf9d92"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:49:55 crc kubenswrapper[4898]: I1211 13:49:55.799720 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:49:56 crc kubenswrapper[4898]: I1211 13:49:56.585262 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xg76c" Dec 11 13:49:56 crc kubenswrapper[4898]: I1211 13:49:56.640523 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xg76c"] Dec 11 13:49:56 crc kubenswrapper[4898]: I1211 13:49:56.653662 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xg76c"] Dec 11 13:49:56 crc kubenswrapper[4898]: I1211 13:49:56.788996 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89dd4b4d-f3fa-4232-95bb-2c1439cf9d92" path="/var/lib/kubelet/pods/89dd4b4d-f3fa-4232-95bb-2c1439cf9d92/volumes" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.251922 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jdzhh"] Dec 11 13:51:16 crc kubenswrapper[4898]: E1211 13:51:16.256187 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89dd4b4d-f3fa-4232-95bb-2c1439cf9d92" containerName="registry-server" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.256237 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dd4b4d-f3fa-4232-95bb-2c1439cf9d92" containerName="registry-server" Dec 11 13:51:16 crc kubenswrapper[4898]: E1211 13:51:16.256254 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b6a6bb00-888f-483a-b9f3-440c5fa5f62e" containerName="extract-utilities" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.256267 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a6bb00-888f-483a-b9f3-440c5fa5f62e" containerName="extract-utilities" Dec 11 13:51:16 crc kubenswrapper[4898]: E1211 13:51:16.256294 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a6bb00-888f-483a-b9f3-440c5fa5f62e" containerName="registry-server" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.256308 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a6bb00-888f-483a-b9f3-440c5fa5f62e" containerName="registry-server" Dec 11 13:51:16 crc kubenswrapper[4898]: E1211 13:51:16.256353 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89dd4b4d-f3fa-4232-95bb-2c1439cf9d92" containerName="extract-utilities" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.256381 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dd4b4d-f3fa-4232-95bb-2c1439cf9d92" containerName="extract-utilities" Dec 11 13:51:16 crc kubenswrapper[4898]: E1211 13:51:16.256407 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a6bb00-888f-483a-b9f3-440c5fa5f62e" containerName="extract-content" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.256419 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a6bb00-888f-483a-b9f3-440c5fa5f62e" containerName="extract-content" Dec 11 13:51:16 crc kubenswrapper[4898]: E1211 13:51:16.256489 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89dd4b4d-f3fa-4232-95bb-2c1439cf9d92" containerName="extract-content" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.256504 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dd4b4d-f3fa-4232-95bb-2c1439cf9d92" containerName="extract-content" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.256892 4898 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="89dd4b4d-f3fa-4232-95bb-2c1439cf9d92" containerName="registry-server" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.256956 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6a6bb00-888f-483a-b9f3-440c5fa5f62e" containerName="registry-server" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.262049 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.272612 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jdzhh"] Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.380107 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/765e571e-58bc-4429-b230-e3ce096050ff-utilities\") pod \"redhat-operators-jdzhh\" (UID: \"765e571e-58bc-4429-b230-e3ce096050ff\") " pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.380511 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/765e571e-58bc-4429-b230-e3ce096050ff-catalog-content\") pod \"redhat-operators-jdzhh\" (UID: \"765e571e-58bc-4429-b230-e3ce096050ff\") " pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.380606 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htbl9\" (UniqueName: \"kubernetes.io/projected/765e571e-58bc-4429-b230-e3ce096050ff-kube-api-access-htbl9\") pod \"redhat-operators-jdzhh\" (UID: \"765e571e-58bc-4429-b230-e3ce096050ff\") " pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.483247 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/765e571e-58bc-4429-b230-e3ce096050ff-catalog-content\") pod \"redhat-operators-jdzhh\" (UID: \"765e571e-58bc-4429-b230-e3ce096050ff\") " pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.483397 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htbl9\" (UniqueName: \"kubernetes.io/projected/765e571e-58bc-4429-b230-e3ce096050ff-kube-api-access-htbl9\") pod \"redhat-operators-jdzhh\" (UID: \"765e571e-58bc-4429-b230-e3ce096050ff\") " pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.483598 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/765e571e-58bc-4429-b230-e3ce096050ff-utilities\") pod \"redhat-operators-jdzhh\" (UID: \"765e571e-58bc-4429-b230-e3ce096050ff\") " pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.484191 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/765e571e-58bc-4429-b230-e3ce096050ff-utilities\") pod \"redhat-operators-jdzhh\" (UID: \"765e571e-58bc-4429-b230-e3ce096050ff\") " pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.484303 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/765e571e-58bc-4429-b230-e3ce096050ff-catalog-content\") pod \"redhat-operators-jdzhh\" (UID: \"765e571e-58bc-4429-b230-e3ce096050ff\") " pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.515353 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htbl9\" (UniqueName: 
\"kubernetes.io/projected/765e571e-58bc-4429-b230-e3ce096050ff-kube-api-access-htbl9\") pod \"redhat-operators-jdzhh\" (UID: \"765e571e-58bc-4429-b230-e3ce096050ff\") " pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:16 crc kubenswrapper[4898]: I1211 13:51:16.606251 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:17 crc kubenswrapper[4898]: I1211 13:51:17.116235 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jdzhh"] Dec 11 13:51:17 crc kubenswrapper[4898]: I1211 13:51:17.850605 4898 generic.go:334] "Generic (PLEG): container finished" podID="765e571e-58bc-4429-b230-e3ce096050ff" containerID="496fb5ffbee74a10747a777d1cac1773b5c857704be6710678bdf5ac0b187043" exitCode=0 Dec 11 13:51:17 crc kubenswrapper[4898]: I1211 13:51:17.850656 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdzhh" event={"ID":"765e571e-58bc-4429-b230-e3ce096050ff","Type":"ContainerDied","Data":"496fb5ffbee74a10747a777d1cac1773b5c857704be6710678bdf5ac0b187043"} Dec 11 13:51:17 crc kubenswrapper[4898]: I1211 13:51:17.850957 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdzhh" event={"ID":"765e571e-58bc-4429-b230-e3ce096050ff","Type":"ContainerStarted","Data":"17eaeae264426ae8cd9935ffdce0ce907a76f74804417d4b20b08230889aaa23"} Dec 11 13:51:19 crc kubenswrapper[4898]: I1211 13:51:19.875872 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdzhh" event={"ID":"765e571e-58bc-4429-b230-e3ce096050ff","Type":"ContainerStarted","Data":"2c0cdb8f64169515afb818da1abcc206847621e3096560b61ad9a8304e441f94"} Dec 11 13:51:27 crc kubenswrapper[4898]: I1211 13:51:27.973407 4898 generic.go:334] "Generic (PLEG): container finished" podID="765e571e-58bc-4429-b230-e3ce096050ff" 
containerID="2c0cdb8f64169515afb818da1abcc206847621e3096560b61ad9a8304e441f94" exitCode=0 Dec 11 13:51:27 crc kubenswrapper[4898]: I1211 13:51:27.974059 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdzhh" event={"ID":"765e571e-58bc-4429-b230-e3ce096050ff","Type":"ContainerDied","Data":"2c0cdb8f64169515afb818da1abcc206847621e3096560b61ad9a8304e441f94"} Dec 11 13:51:28 crc kubenswrapper[4898]: I1211 13:51:28.988925 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdzhh" event={"ID":"765e571e-58bc-4429-b230-e3ce096050ff","Type":"ContainerStarted","Data":"35ffffdc004e0ba44e5e5a3ce515c0ba88c74a4c3a0964737f310d2a24aab9a9"} Dec 11 13:51:29 crc kubenswrapper[4898]: I1211 13:51:29.017768 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jdzhh" podStartSLOduration=2.405335076 podStartE2EDuration="13.017745467s" podCreationTimestamp="2025-12-11 13:51:16 +0000 UTC" firstStartedPulling="2025-12-11 13:51:17.852985405 +0000 UTC m=+2835.425311842" lastFinishedPulling="2025-12-11 13:51:28.465395806 +0000 UTC m=+2846.037722233" observedRunningTime="2025-12-11 13:51:29.005279441 +0000 UTC m=+2846.577605888" watchObservedRunningTime="2025-12-11 13:51:29.017745467 +0000 UTC m=+2846.590071904" Dec 11 13:51:34 crc kubenswrapper[4898]: I1211 13:51:34.995982 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:51:34 crc kubenswrapper[4898]: I1211 13:51:34.996544 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:51:36 crc kubenswrapper[4898]: I1211 13:51:36.606520 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:36 crc kubenswrapper[4898]: I1211 13:51:36.606842 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:37 crc kubenswrapper[4898]: I1211 13:51:37.657992 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jdzhh" podUID="765e571e-58bc-4429-b230-e3ce096050ff" containerName="registry-server" probeResult="failure" output=< Dec 11 13:51:37 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 13:51:37 crc kubenswrapper[4898]: > Dec 11 13:51:46 crc kubenswrapper[4898]: I1211 13:51:46.667930 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:46 crc kubenswrapper[4898]: I1211 13:51:46.719022 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:47 crc kubenswrapper[4898]: I1211 13:51:47.447675 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jdzhh"] Dec 11 13:51:48 crc kubenswrapper[4898]: I1211 13:51:48.175368 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jdzhh" podUID="765e571e-58bc-4429-b230-e3ce096050ff" containerName="registry-server" containerID="cri-o://35ffffdc004e0ba44e5e5a3ce515c0ba88c74a4c3a0964737f310d2a24aab9a9" gracePeriod=2 Dec 11 13:51:49 crc kubenswrapper[4898]: I1211 13:51:49.186573 4898 generic.go:334] "Generic (PLEG): container finished" podID="765e571e-58bc-4429-b230-e3ce096050ff" 
containerID="35ffffdc004e0ba44e5e5a3ce515c0ba88c74a4c3a0964737f310d2a24aab9a9" exitCode=0 Dec 11 13:51:49 crc kubenswrapper[4898]: I1211 13:51:49.186662 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdzhh" event={"ID":"765e571e-58bc-4429-b230-e3ce096050ff","Type":"ContainerDied","Data":"35ffffdc004e0ba44e5e5a3ce515c0ba88c74a4c3a0964737f310d2a24aab9a9"} Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.050770 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.181570 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htbl9\" (UniqueName: \"kubernetes.io/projected/765e571e-58bc-4429-b230-e3ce096050ff-kube-api-access-htbl9\") pod \"765e571e-58bc-4429-b230-e3ce096050ff\" (UID: \"765e571e-58bc-4429-b230-e3ce096050ff\") " Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.181627 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/765e571e-58bc-4429-b230-e3ce096050ff-utilities\") pod \"765e571e-58bc-4429-b230-e3ce096050ff\" (UID: \"765e571e-58bc-4429-b230-e3ce096050ff\") " Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.181756 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/765e571e-58bc-4429-b230-e3ce096050ff-catalog-content\") pod \"765e571e-58bc-4429-b230-e3ce096050ff\" (UID: \"765e571e-58bc-4429-b230-e3ce096050ff\") " Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.183141 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/765e571e-58bc-4429-b230-e3ce096050ff-utilities" (OuterVolumeSpecName: "utilities") pod "765e571e-58bc-4429-b230-e3ce096050ff" (UID: 
"765e571e-58bc-4429-b230-e3ce096050ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.187816 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/765e571e-58bc-4429-b230-e3ce096050ff-kube-api-access-htbl9" (OuterVolumeSpecName: "kube-api-access-htbl9") pod "765e571e-58bc-4429-b230-e3ce096050ff" (UID: "765e571e-58bc-4429-b230-e3ce096050ff"). InnerVolumeSpecName "kube-api-access-htbl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.199241 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jdzhh" event={"ID":"765e571e-58bc-4429-b230-e3ce096050ff","Type":"ContainerDied","Data":"17eaeae264426ae8cd9935ffdce0ce907a76f74804417d4b20b08230889aaa23"} Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.199299 4898 scope.go:117] "RemoveContainer" containerID="35ffffdc004e0ba44e5e5a3ce515c0ba88c74a4c3a0964737f310d2a24aab9a9" Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.199517 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jdzhh" Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.257427 4898 scope.go:117] "RemoveContainer" containerID="2c0cdb8f64169515afb818da1abcc206847621e3096560b61ad9a8304e441f94" Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.282987 4898 scope.go:117] "RemoveContainer" containerID="496fb5ffbee74a10747a777d1cac1773b5c857704be6710678bdf5ac0b187043" Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.284800 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htbl9\" (UniqueName: \"kubernetes.io/projected/765e571e-58bc-4429-b230-e3ce096050ff-kube-api-access-htbl9\") on node \"crc\" DevicePath \"\"" Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.284824 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/765e571e-58bc-4429-b230-e3ce096050ff-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.301160 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/765e571e-58bc-4429-b230-e3ce096050ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "765e571e-58bc-4429-b230-e3ce096050ff" (UID: "765e571e-58bc-4429-b230-e3ce096050ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.387093 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/765e571e-58bc-4429-b230-e3ce096050ff-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.543670 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jdzhh"] Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.552912 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jdzhh"] Dec 11 13:51:50 crc kubenswrapper[4898]: I1211 13:51:50.793797 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="765e571e-58bc-4429-b230-e3ce096050ff" path="/var/lib/kubelet/pods/765e571e-58bc-4429-b230-e3ce096050ff/volumes" Dec 11 13:52:04 crc kubenswrapper[4898]: I1211 13:52:04.996119 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:52:04 crc kubenswrapper[4898]: I1211 13:52:04.997638 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:52:34 crc kubenswrapper[4898]: I1211 13:52:34.996328 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Dec 11 13:52:34 crc kubenswrapper[4898]: I1211 13:52:34.996889 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:52:34 crc kubenswrapper[4898]: I1211 13:52:34.996947 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:52:34 crc kubenswrapper[4898]: I1211 13:52:34.998762 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af271096b767ce5903084b05e8ad0ee53c330083b7f59f687de7d1f33e49b2a5"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 13:52:34 crc kubenswrapper[4898]: I1211 13:52:34.998859 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://af271096b767ce5903084b05e8ad0ee53c330083b7f59f687de7d1f33e49b2a5" gracePeriod=600 Dec 11 13:52:35 crc kubenswrapper[4898]: I1211 13:52:35.730049 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="af271096b767ce5903084b05e8ad0ee53c330083b7f59f687de7d1f33e49b2a5" exitCode=0 Dec 11 13:52:35 crc kubenswrapper[4898]: I1211 13:52:35.730281 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" 
event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"af271096b767ce5903084b05e8ad0ee53c330083b7f59f687de7d1f33e49b2a5"} Dec 11 13:52:35 crc kubenswrapper[4898]: I1211 13:52:35.730431 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9"} Dec 11 13:52:35 crc kubenswrapper[4898]: I1211 13:52:35.730448 4898 scope.go:117] "RemoveContainer" containerID="27ad0d928a4e9168d47090faf6f1e780be1daad349640ac754f272e963e95eb8" Dec 11 13:52:36 crc kubenswrapper[4898]: I1211 13:52:36.749091 4898 generic.go:334] "Generic (PLEG): container finished" podID="57ebbb8f-174c-4862-bba6-5644c98c7b1c" containerID="8004f1601799949605ee1cca104be273039dce6d3f4d1ca0f49f5829c374c5d2" exitCode=0 Dec 11 13:52:36 crc kubenswrapper[4898]: I1211 13:52:36.749171 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" event={"ID":"57ebbb8f-174c-4862-bba6-5644c98c7b1c","Type":"ContainerDied","Data":"8004f1601799949605ee1cca104be273039dce6d3f4d1ca0f49f5829c374c5d2"} Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.251994 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.444434 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-migration-ssh-key-1\") pod \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.444534 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvk9s\" (UniqueName: \"kubernetes.io/projected/57ebbb8f-174c-4862-bba6-5644c98c7b1c-kube-api-access-xvk9s\") pod \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.444567 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-cell1-compute-config-0\") pod \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.444621 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-extra-config-0\") pod \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.444664 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-inventory\") pod \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.444843 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-cell1-compute-config-1\") pod \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.444935 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-combined-ca-bundle\") pod \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.445239 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-ssh-key\") pod \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.445307 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-migration-ssh-key-0\") pod \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\" (UID: \"57ebbb8f-174c-4862-bba6-5644c98c7b1c\") " Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.464923 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "57ebbb8f-174c-4862-bba6-5644c98c7b1c" (UID: "57ebbb8f-174c-4862-bba6-5644c98c7b1c"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.464978 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57ebbb8f-174c-4862-bba6-5644c98c7b1c-kube-api-access-xvk9s" (OuterVolumeSpecName: "kube-api-access-xvk9s") pod "57ebbb8f-174c-4862-bba6-5644c98c7b1c" (UID: "57ebbb8f-174c-4862-bba6-5644c98c7b1c"). InnerVolumeSpecName "kube-api-access-xvk9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.491933 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "57ebbb8f-174c-4862-bba6-5644c98c7b1c" (UID: "57ebbb8f-174c-4862-bba6-5644c98c7b1c"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.492700 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "57ebbb8f-174c-4862-bba6-5644c98c7b1c" (UID: "57ebbb8f-174c-4862-bba6-5644c98c7b1c"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.496523 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-inventory" (OuterVolumeSpecName: "inventory") pod "57ebbb8f-174c-4862-bba6-5644c98c7b1c" (UID: "57ebbb8f-174c-4862-bba6-5644c98c7b1c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.496637 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "57ebbb8f-174c-4862-bba6-5644c98c7b1c" (UID: "57ebbb8f-174c-4862-bba6-5644c98c7b1c"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.509038 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "57ebbb8f-174c-4862-bba6-5644c98c7b1c" (UID: "57ebbb8f-174c-4862-bba6-5644c98c7b1c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.510884 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "57ebbb8f-174c-4862-bba6-5644c98c7b1c" (UID: "57ebbb8f-174c-4862-bba6-5644c98c7b1c"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.511309 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "57ebbb8f-174c-4862-bba6-5644c98c7b1c" (UID: "57ebbb8f-174c-4862-bba6-5644c98c7b1c"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.548912 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.548971 4898 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.548991 4898 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.549010 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvk9s\" (UniqueName: \"kubernetes.io/projected/57ebbb8f-174c-4862-bba6-5644c98c7b1c-kube-api-access-xvk9s\") on node \"crc\" DevicePath \"\"" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.549026 4898 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.549039 4898 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.549053 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:52:38 
crc kubenswrapper[4898]: I1211 13:52:38.549068 4898 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.549079 4898 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ebbb8f-174c-4862-bba6-5644c98c7b1c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.779778 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.794711 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jrmnk" event={"ID":"57ebbb8f-174c-4862-bba6-5644c98c7b1c","Type":"ContainerDied","Data":"f580840c1ac311c65d635856e19d2ae8db9435d0bcf31c74be369316fbf1c58b"} Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.795375 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f580840c1ac311c65d635856e19d2ae8db9435d0bcf31c74be369316fbf1c58b" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.875903 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn"] Dec 11 13:52:38 crc kubenswrapper[4898]: E1211 13:52:38.876600 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ebbb8f-174c-4862-bba6-5644c98c7b1c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.876700 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ebbb8f-174c-4862-bba6-5644c98c7b1c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 11 13:52:38 crc kubenswrapper[4898]: E1211 
13:52:38.876786 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765e571e-58bc-4429-b230-e3ce096050ff" containerName="registry-server" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.876843 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="765e571e-58bc-4429-b230-e3ce096050ff" containerName="registry-server" Dec 11 13:52:38 crc kubenswrapper[4898]: E1211 13:52:38.876903 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765e571e-58bc-4429-b230-e3ce096050ff" containerName="extract-utilities" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.876976 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="765e571e-58bc-4429-b230-e3ce096050ff" containerName="extract-utilities" Dec 11 13:52:38 crc kubenswrapper[4898]: E1211 13:52:38.877086 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765e571e-58bc-4429-b230-e3ce096050ff" containerName="extract-content" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.877153 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="765e571e-58bc-4429-b230-e3ce096050ff" containerName="extract-content" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.877476 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="765e571e-58bc-4429-b230-e3ce096050ff" containerName="registry-server" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.877587 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ebbb8f-174c-4862-bba6-5644c98c7b1c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.878495 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.880911 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.881566 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.882006 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.882552 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.883644 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.895441 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn"] Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.958530 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.958570 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: 
\"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.958620 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq9p4\" (UniqueName: \"kubernetes.io/projected/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-kube-api-access-lq9p4\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.958644 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.958717 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.958742 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 
11 13:52:38 crc kubenswrapper[4898]: I1211 13:52:38.958835 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.060643 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.061174 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.061428 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.061588 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.061705 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.061933 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq9p4\" (UniqueName: \"kubernetes.io/projected/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-kube-api-access-lq9p4\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.062053 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.066627 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: 
\"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.066722 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.067422 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.067667 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.068383 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.078558 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.082136 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq9p4\" (UniqueName: \"kubernetes.io/projected/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-kube-api-access-lq9p4\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.201280 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.603956 4898 scope.go:117] "RemoveContainer" containerID="8e5cc45055fbfe6510f8947e88e6acfb005d703b4b1a592f2ca5cffab83d21f3" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.638611 4898 scope.go:117] "RemoveContainer" containerID="eb45be53de91b7a5e10abcb0ab0e966abee79dc671ba15873d5786f5e9806ebf" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.715547 4898 scope.go:117] "RemoveContainer" containerID="8a35ed31824daaaa494128fd5f2112ce1a6fc2d34cd3fdfd4be97786bf48e064" Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.758662 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn"] Dec 11 13:52:39 crc kubenswrapper[4898]: W1211 13:52:39.778898 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4f48310_0f95_46eb_9a4f_d1058c7a47c0.slice/crio-ff484a71c25cb34786d6902c4fe73c97d3b565c708187a449b8f1ab7fc718108 WatchSource:0}: Error finding container ff484a71c25cb34786d6902c4fe73c97d3b565c708187a449b8f1ab7fc718108: Status 404 returned error can't find the container with id ff484a71c25cb34786d6902c4fe73c97d3b565c708187a449b8f1ab7fc718108 Dec 11 13:52:39 crc kubenswrapper[4898]: I1211 13:52:39.797834 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" event={"ID":"c4f48310-0f95-46eb-9a4f-d1058c7a47c0","Type":"ContainerStarted","Data":"ff484a71c25cb34786d6902c4fe73c97d3b565c708187a449b8f1ab7fc718108"} Dec 11 13:52:41 crc kubenswrapper[4898]: I1211 13:52:41.830452 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" event={"ID":"c4f48310-0f95-46eb-9a4f-d1058c7a47c0","Type":"ContainerStarted","Data":"6665b08dc32aeacba73865bb8d1311fc683214fa23bc81c6581e2ae0136c1166"} Dec 11 13:52:41 crc kubenswrapper[4898]: I1211 13:52:41.853206 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" podStartSLOduration=3.289267031 podStartE2EDuration="3.85318263s" podCreationTimestamp="2025-12-11 13:52:38 +0000 UTC" firstStartedPulling="2025-12-11 13:52:39.781860704 +0000 UTC m=+2917.354187151" lastFinishedPulling="2025-12-11 13:52:40.345776313 +0000 UTC m=+2917.918102750" observedRunningTime="2025-12-11 13:52:41.851834694 +0000 UTC m=+2919.424161161" watchObservedRunningTime="2025-12-11 13:52:41.85318263 +0000 UTC m=+2919.425509107" Dec 11 13:55:04 crc kubenswrapper[4898]: I1211 13:55:04.996182 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:55:04 crc kubenswrapper[4898]: I1211 13:55:04.996781 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:55:06 crc kubenswrapper[4898]: I1211 13:55:06.456701 4898 generic.go:334] "Generic (PLEG): container finished" podID="c4f48310-0f95-46eb-9a4f-d1058c7a47c0" containerID="6665b08dc32aeacba73865bb8d1311fc683214fa23bc81c6581e2ae0136c1166" exitCode=0 Dec 11 13:55:06 crc kubenswrapper[4898]: I1211 13:55:06.456773 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" event={"ID":"c4f48310-0f95-46eb-9a4f-d1058c7a47c0","Type":"ContainerDied","Data":"6665b08dc32aeacba73865bb8d1311fc683214fa23bc81c6581e2ae0136c1166"} Dec 11 13:55:07 crc kubenswrapper[4898]: I1211 13:55:07.932216 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:55:07 crc kubenswrapper[4898]: I1211 13:55:07.982495 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq9p4\" (UniqueName: \"kubernetes.io/projected/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-kube-api-access-lq9p4\") pod \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " Dec 11 13:55:07 crc kubenswrapper[4898]: I1211 13:55:07.982600 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ssh-key\") pod \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " Dec 11 13:55:07 crc kubenswrapper[4898]: I1211 13:55:07.982749 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-inventory\") pod \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " Dec 11 13:55:07 crc kubenswrapper[4898]: I1211 13:55:07.982914 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-0\") pod \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " Dec 11 13:55:07 crc kubenswrapper[4898]: I1211 13:55:07.982992 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-telemetry-combined-ca-bundle\") pod \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " Dec 11 13:55:07 crc kubenswrapper[4898]: I1211 13:55:07.983051 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-2\") pod \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " Dec 11 13:55:07 crc kubenswrapper[4898]: I1211 13:55:07.983078 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-1\") pod \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\" (UID: \"c4f48310-0f95-46eb-9a4f-d1058c7a47c0\") " Dec 11 13:55:07 crc kubenswrapper[4898]: I1211 13:55:07.990145 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-kube-api-access-lq9p4" (OuterVolumeSpecName: "kube-api-access-lq9p4") pod "c4f48310-0f95-46eb-9a4f-d1058c7a47c0" (UID: "c4f48310-0f95-46eb-9a4f-d1058c7a47c0"). InnerVolumeSpecName "kube-api-access-lq9p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:55:07 crc kubenswrapper[4898]: I1211 13:55:07.992074 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "c4f48310-0f95-46eb-9a4f-d1058c7a47c0" (UID: "c4f48310-0f95-46eb-9a4f-d1058c7a47c0"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.024271 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "c4f48310-0f95-46eb-9a4f-d1058c7a47c0" (UID: "c4f48310-0f95-46eb-9a4f-d1058c7a47c0"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.028776 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-inventory" (OuterVolumeSpecName: "inventory") pod "c4f48310-0f95-46eb-9a4f-d1058c7a47c0" (UID: "c4f48310-0f95-46eb-9a4f-d1058c7a47c0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.042653 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "c4f48310-0f95-46eb-9a4f-d1058c7a47c0" (UID: "c4f48310-0f95-46eb-9a4f-d1058c7a47c0"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.049723 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "c4f48310-0f95-46eb-9a4f-d1058c7a47c0" (UID: "c4f48310-0f95-46eb-9a4f-d1058c7a47c0"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.054910 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c4f48310-0f95-46eb-9a4f-d1058c7a47c0" (UID: "c4f48310-0f95-46eb-9a4f-d1058c7a47c0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.086897 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.086934 4898 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.086946 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.086958 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.086983 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq9p4\" (UniqueName: \"kubernetes.io/projected/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-kube-api-access-lq9p4\") on node \"crc\" DevicePath \"\"" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 
13:55:08.086994 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.087005 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4f48310-0f95-46eb-9a4f-d1058c7a47c0-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.482883 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" event={"ID":"c4f48310-0f95-46eb-9a4f-d1058c7a47c0","Type":"ContainerDied","Data":"ff484a71c25cb34786d6902c4fe73c97d3b565c708187a449b8f1ab7fc718108"} Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.483170 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff484a71c25cb34786d6902c4fe73c97d3b565c708187a449b8f1ab7fc718108" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.483236 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.594495 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg"] Dec 11 13:55:08 crc kubenswrapper[4898]: E1211 13:55:08.594971 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4f48310-0f95-46eb-9a4f-d1058c7a47c0" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.594988 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4f48310-0f95-46eb-9a4f-d1058c7a47c0" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.595247 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4f48310-0f95-46eb-9a4f-d1058c7a47c0" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.596031 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.598717 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.603258 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.603335 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.603347 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.603375 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.617705 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg"] Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.700599 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.700742 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-1\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.700855 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.700984 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.701035 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.701113 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.701170 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw8lt\" (UniqueName: \"kubernetes.io/projected/2330f5fe-a653-4a3b-8563-d9bce00bd081-kube-api-access-xw8lt\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.804636 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.804737 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.804822 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.804922 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.804954 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.805014 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.805061 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw8lt\" (UniqueName: \"kubernetes.io/projected/2330f5fe-a653-4a3b-8563-d9bce00bd081-kube-api-access-xw8lt\") 
pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.810125 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.810598 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.810684 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.810701 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " 
pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.816693 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.816826 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.823846 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw8lt\" (UniqueName: \"kubernetes.io/projected/2330f5fe-a653-4a3b-8563-d9bce00bd081-kube-api-access-xw8lt\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:08 crc kubenswrapper[4898]: I1211 13:55:08.922586 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:55:09 crc kubenswrapper[4898]: I1211 13:55:09.489699 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg"] Dec 11 13:55:09 crc kubenswrapper[4898]: I1211 13:55:09.502045 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 13:55:10 crc kubenswrapper[4898]: I1211 13:55:10.525672 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" event={"ID":"2330f5fe-a653-4a3b-8563-d9bce00bd081","Type":"ContainerStarted","Data":"302bb5c2b4786ba3a877779624f270cb05e1b19d55c88752bcb92d19ba1c6d6d"} Dec 11 13:55:12 crc kubenswrapper[4898]: I1211 13:55:12.557255 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" event={"ID":"2330f5fe-a653-4a3b-8563-d9bce00bd081","Type":"ContainerStarted","Data":"a0438c55e351bdc0b34802a97d9dd73ee09f4d6c2c3e23c0fc48d2f31f5b3b90"} Dec 11 13:55:12 crc kubenswrapper[4898]: I1211 13:55:12.585359 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" podStartSLOduration=2.831565054 podStartE2EDuration="4.585335406s" podCreationTimestamp="2025-12-11 13:55:08 +0000 UTC" firstStartedPulling="2025-12-11 13:55:09.501739205 +0000 UTC m=+3067.074065642" lastFinishedPulling="2025-12-11 13:55:11.255509557 +0000 UTC m=+3068.827835994" observedRunningTime="2025-12-11 13:55:12.579555381 +0000 UTC m=+3070.151881828" watchObservedRunningTime="2025-12-11 13:55:12.585335406 +0000 UTC m=+3070.157661843" Dec 11 13:55:34 crc kubenswrapper[4898]: I1211 13:55:34.996295 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:55:34 crc kubenswrapper[4898]: I1211 13:55:34.996824 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:55:39 crc kubenswrapper[4898]: I1211 13:55:39.879499 4898 scope.go:117] "RemoveContainer" containerID="20aea438ccb0f34218fa4d3df0214020bb664ddb8e82f6e1052bbce6bb2e360e" Dec 11 13:55:39 crc kubenswrapper[4898]: I1211 13:55:39.923330 4898 scope.go:117] "RemoveContainer" containerID="df6be2bd3001b92632f91facff448c6dfec2d0d6da8aca6b8294bd7d6865e108" Dec 11 13:56:00 crc kubenswrapper[4898]: I1211 13:56:00.202657 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d8j98"] Dec 11 13:56:00 crc kubenswrapper[4898]: I1211 13:56:00.205787 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:00 crc kubenswrapper[4898]: I1211 13:56:00.218239 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d8j98"] Dec 11 13:56:00 crc kubenswrapper[4898]: I1211 13:56:00.371820 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-catalog-content\") pod \"community-operators-d8j98\" (UID: \"b4ffa32d-cc15-4d59-a3d1-70161aa942bf\") " pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:00 crc kubenswrapper[4898]: I1211 13:56:00.372297 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94c9t\" (UniqueName: \"kubernetes.io/projected/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-kube-api-access-94c9t\") pod \"community-operators-d8j98\" (UID: \"b4ffa32d-cc15-4d59-a3d1-70161aa942bf\") " pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:00 crc kubenswrapper[4898]: I1211 13:56:00.372663 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-utilities\") pod \"community-operators-d8j98\" (UID: \"b4ffa32d-cc15-4d59-a3d1-70161aa942bf\") " pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:00 crc kubenswrapper[4898]: I1211 13:56:00.474644 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-utilities\") pod \"community-operators-d8j98\" (UID: \"b4ffa32d-cc15-4d59-a3d1-70161aa942bf\") " pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:00 crc kubenswrapper[4898]: I1211 13:56:00.474728 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-catalog-content\") pod \"community-operators-d8j98\" (UID: \"b4ffa32d-cc15-4d59-a3d1-70161aa942bf\") " pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:00 crc kubenswrapper[4898]: I1211 13:56:00.474862 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94c9t\" (UniqueName: \"kubernetes.io/projected/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-kube-api-access-94c9t\") pod \"community-operators-d8j98\" (UID: \"b4ffa32d-cc15-4d59-a3d1-70161aa942bf\") " pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:00 crc kubenswrapper[4898]: I1211 13:56:00.475211 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-utilities\") pod \"community-operators-d8j98\" (UID: \"b4ffa32d-cc15-4d59-a3d1-70161aa942bf\") " pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:00 crc kubenswrapper[4898]: I1211 13:56:00.475387 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-catalog-content\") pod \"community-operators-d8j98\" (UID: \"b4ffa32d-cc15-4d59-a3d1-70161aa942bf\") " pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:00 crc kubenswrapper[4898]: I1211 13:56:00.510347 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94c9t\" (UniqueName: \"kubernetes.io/projected/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-kube-api-access-94c9t\") pod \"community-operators-d8j98\" (UID: \"b4ffa32d-cc15-4d59-a3d1-70161aa942bf\") " pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:00 crc kubenswrapper[4898]: I1211 13:56:00.537653 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:01 crc kubenswrapper[4898]: I1211 13:56:01.110649 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d8j98"] Dec 11 13:56:01 crc kubenswrapper[4898]: W1211 13:56:01.119487 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4ffa32d_cc15_4d59_a3d1_70161aa942bf.slice/crio-a7f3a7cf4c5bb19fb994d5564b1e07b1d820c3a43d3f9270ac6d8b1903c587e6 WatchSource:0}: Error finding container a7f3a7cf4c5bb19fb994d5564b1e07b1d820c3a43d3f9270ac6d8b1903c587e6: Status 404 returned error can't find the container with id a7f3a7cf4c5bb19fb994d5564b1e07b1d820c3a43d3f9270ac6d8b1903c587e6 Dec 11 13:56:01 crc kubenswrapper[4898]: I1211 13:56:01.766817 4898 generic.go:334] "Generic (PLEG): container finished" podID="b4ffa32d-cc15-4d59-a3d1-70161aa942bf" containerID="7ac9d71d6159e9f18b08db7cf010885c10dde6da98196f1b8e1df09c06d1b470" exitCode=0 Dec 11 13:56:01 crc kubenswrapper[4898]: I1211 13:56:01.766857 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8j98" event={"ID":"b4ffa32d-cc15-4d59-a3d1-70161aa942bf","Type":"ContainerDied","Data":"7ac9d71d6159e9f18b08db7cf010885c10dde6da98196f1b8e1df09c06d1b470"} Dec 11 13:56:01 crc kubenswrapper[4898]: I1211 13:56:01.767097 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8j98" event={"ID":"b4ffa32d-cc15-4d59-a3d1-70161aa942bf","Type":"ContainerStarted","Data":"a7f3a7cf4c5bb19fb994d5564b1e07b1d820c3a43d3f9270ac6d8b1903c587e6"} Dec 11 13:56:04 crc kubenswrapper[4898]: I1211 13:56:04.802887 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8j98" 
event={"ID":"b4ffa32d-cc15-4d59-a3d1-70161aa942bf","Type":"ContainerStarted","Data":"b0257a125d5720e8cbc0ea3c4a78b541643acd20f618a5ab3914284cdbd58b76"} Dec 11 13:56:04 crc kubenswrapper[4898]: I1211 13:56:04.995658 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 13:56:04 crc kubenswrapper[4898]: I1211 13:56:04.995742 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 13:56:04 crc kubenswrapper[4898]: I1211 13:56:04.995795 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 13:56:04 crc kubenswrapper[4898]: I1211 13:56:04.996827 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 13:56:04 crc kubenswrapper[4898]: I1211 13:56:04.996911 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" gracePeriod=600 Dec 11 13:56:05 crc kubenswrapper[4898]: I1211 13:56:05.820403 
4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" exitCode=0 Dec 11 13:56:05 crc kubenswrapper[4898]: I1211 13:56:05.820500 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9"} Dec 11 13:56:05 crc kubenswrapper[4898]: I1211 13:56:05.820857 4898 scope.go:117] "RemoveContainer" containerID="af271096b767ce5903084b05e8ad0ee53c330083b7f59f687de7d1f33e49b2a5" Dec 11 13:56:05 crc kubenswrapper[4898]: I1211 13:56:05.825849 4898 generic.go:334] "Generic (PLEG): container finished" podID="b4ffa32d-cc15-4d59-a3d1-70161aa942bf" containerID="b0257a125d5720e8cbc0ea3c4a78b541643acd20f618a5ab3914284cdbd58b76" exitCode=0 Dec 11 13:56:05 crc kubenswrapper[4898]: I1211 13:56:05.825892 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8j98" event={"ID":"b4ffa32d-cc15-4d59-a3d1-70161aa942bf","Type":"ContainerDied","Data":"b0257a125d5720e8cbc0ea3c4a78b541643acd20f618a5ab3914284cdbd58b76"} Dec 11 13:56:06 crc kubenswrapper[4898]: E1211 13:56:06.226398 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:56:06 crc kubenswrapper[4898]: I1211 13:56:06.842802 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:56:06 crc kubenswrapper[4898]: 
E1211 13:56:06.843668 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:56:06 crc kubenswrapper[4898]: I1211 13:56:06.847923 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8j98" event={"ID":"b4ffa32d-cc15-4d59-a3d1-70161aa942bf","Type":"ContainerStarted","Data":"251c58e1b9e552d48cf6d940bc0bd8788f33761d87c9661a557a151389656160"} Dec 11 13:56:06 crc kubenswrapper[4898]: I1211 13:56:06.888892 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d8j98" podStartSLOduration=2.340102058 podStartE2EDuration="6.888870694s" podCreationTimestamp="2025-12-11 13:56:00 +0000 UTC" firstStartedPulling="2025-12-11 13:56:01.76911281 +0000 UTC m=+3119.341439257" lastFinishedPulling="2025-12-11 13:56:06.317881436 +0000 UTC m=+3123.890207893" observedRunningTime="2025-12-11 13:56:06.882986006 +0000 UTC m=+3124.455312463" watchObservedRunningTime="2025-12-11 13:56:06.888870694 +0000 UTC m=+3124.461197151" Dec 11 13:56:10 crc kubenswrapper[4898]: I1211 13:56:10.538002 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:10 crc kubenswrapper[4898]: I1211 13:56:10.538940 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:10 crc kubenswrapper[4898]: I1211 13:56:10.588858 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:11 crc 
kubenswrapper[4898]: I1211 13:56:11.952682 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:12 crc kubenswrapper[4898]: I1211 13:56:12.007528 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d8j98"] Dec 11 13:56:13 crc kubenswrapper[4898]: I1211 13:56:13.921844 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d8j98" podUID="b4ffa32d-cc15-4d59-a3d1-70161aa942bf" containerName="registry-server" containerID="cri-o://251c58e1b9e552d48cf6d940bc0bd8788f33761d87c9661a557a151389656160" gracePeriod=2 Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.412376 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.545051 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94c9t\" (UniqueName: \"kubernetes.io/projected/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-kube-api-access-94c9t\") pod \"b4ffa32d-cc15-4d59-a3d1-70161aa942bf\" (UID: \"b4ffa32d-cc15-4d59-a3d1-70161aa942bf\") " Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.545165 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-catalog-content\") pod \"b4ffa32d-cc15-4d59-a3d1-70161aa942bf\" (UID: \"b4ffa32d-cc15-4d59-a3d1-70161aa942bf\") " Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.545227 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-utilities\") pod \"b4ffa32d-cc15-4d59-a3d1-70161aa942bf\" (UID: \"b4ffa32d-cc15-4d59-a3d1-70161aa942bf\") " Dec 11 13:56:14 
crc kubenswrapper[4898]: I1211 13:56:14.546119 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-utilities" (OuterVolumeSpecName: "utilities") pod "b4ffa32d-cc15-4d59-a3d1-70161aa942bf" (UID: "b4ffa32d-cc15-4d59-a3d1-70161aa942bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.556510 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-kube-api-access-94c9t" (OuterVolumeSpecName: "kube-api-access-94c9t") pod "b4ffa32d-cc15-4d59-a3d1-70161aa942bf" (UID: "b4ffa32d-cc15-4d59-a3d1-70161aa942bf"). InnerVolumeSpecName "kube-api-access-94c9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.613976 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4ffa32d-cc15-4d59-a3d1-70161aa942bf" (UID: "b4ffa32d-cc15-4d59-a3d1-70161aa942bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.647874 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94c9t\" (UniqueName: \"kubernetes.io/projected/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-kube-api-access-94c9t\") on node \"crc\" DevicePath \"\"" Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.647907 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.647922 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4ffa32d-cc15-4d59-a3d1-70161aa942bf-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.935296 4898 generic.go:334] "Generic (PLEG): container finished" podID="b4ffa32d-cc15-4d59-a3d1-70161aa942bf" containerID="251c58e1b9e552d48cf6d940bc0bd8788f33761d87c9661a557a151389656160" exitCode=0 Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.935363 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8j98" event={"ID":"b4ffa32d-cc15-4d59-a3d1-70161aa942bf","Type":"ContainerDied","Data":"251c58e1b9e552d48cf6d940bc0bd8788f33761d87c9661a557a151389656160"} Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.935372 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d8j98" Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.935416 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d8j98" event={"ID":"b4ffa32d-cc15-4d59-a3d1-70161aa942bf","Type":"ContainerDied","Data":"a7f3a7cf4c5bb19fb994d5564b1e07b1d820c3a43d3f9270ac6d8b1903c587e6"} Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.935446 4898 scope.go:117] "RemoveContainer" containerID="251c58e1b9e552d48cf6d940bc0bd8788f33761d87c9661a557a151389656160" Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.967210 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d8j98"] Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.973276 4898 scope.go:117] "RemoveContainer" containerID="b0257a125d5720e8cbc0ea3c4a78b541643acd20f618a5ab3914284cdbd58b76" Dec 11 13:56:14 crc kubenswrapper[4898]: I1211 13:56:14.979536 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d8j98"] Dec 11 13:56:15 crc kubenswrapper[4898]: I1211 13:56:15.001538 4898 scope.go:117] "RemoveContainer" containerID="7ac9d71d6159e9f18b08db7cf010885c10dde6da98196f1b8e1df09c06d1b470" Dec 11 13:56:15 crc kubenswrapper[4898]: I1211 13:56:15.052740 4898 scope.go:117] "RemoveContainer" containerID="251c58e1b9e552d48cf6d940bc0bd8788f33761d87c9661a557a151389656160" Dec 11 13:56:15 crc kubenswrapper[4898]: E1211 13:56:15.053304 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"251c58e1b9e552d48cf6d940bc0bd8788f33761d87c9661a557a151389656160\": container with ID starting with 251c58e1b9e552d48cf6d940bc0bd8788f33761d87c9661a557a151389656160 not found: ID does not exist" containerID="251c58e1b9e552d48cf6d940bc0bd8788f33761d87c9661a557a151389656160" Dec 11 13:56:15 crc kubenswrapper[4898]: I1211 13:56:15.053346 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"251c58e1b9e552d48cf6d940bc0bd8788f33761d87c9661a557a151389656160"} err="failed to get container status \"251c58e1b9e552d48cf6d940bc0bd8788f33761d87c9661a557a151389656160\": rpc error: code = NotFound desc = could not find container \"251c58e1b9e552d48cf6d940bc0bd8788f33761d87c9661a557a151389656160\": container with ID starting with 251c58e1b9e552d48cf6d940bc0bd8788f33761d87c9661a557a151389656160 not found: ID does not exist" Dec 11 13:56:15 crc kubenswrapper[4898]: I1211 13:56:15.053373 4898 scope.go:117] "RemoveContainer" containerID="b0257a125d5720e8cbc0ea3c4a78b541643acd20f618a5ab3914284cdbd58b76" Dec 11 13:56:15 crc kubenswrapper[4898]: E1211 13:56:15.053765 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0257a125d5720e8cbc0ea3c4a78b541643acd20f618a5ab3914284cdbd58b76\": container with ID starting with b0257a125d5720e8cbc0ea3c4a78b541643acd20f618a5ab3914284cdbd58b76 not found: ID does not exist" containerID="b0257a125d5720e8cbc0ea3c4a78b541643acd20f618a5ab3914284cdbd58b76" Dec 11 13:56:15 crc kubenswrapper[4898]: I1211 13:56:15.053855 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0257a125d5720e8cbc0ea3c4a78b541643acd20f618a5ab3914284cdbd58b76"} err="failed to get container status \"b0257a125d5720e8cbc0ea3c4a78b541643acd20f618a5ab3914284cdbd58b76\": rpc error: code = NotFound desc = could not find container \"b0257a125d5720e8cbc0ea3c4a78b541643acd20f618a5ab3914284cdbd58b76\": container with ID starting with b0257a125d5720e8cbc0ea3c4a78b541643acd20f618a5ab3914284cdbd58b76 not found: ID does not exist" Dec 11 13:56:15 crc kubenswrapper[4898]: I1211 13:56:15.053891 4898 scope.go:117] "RemoveContainer" containerID="7ac9d71d6159e9f18b08db7cf010885c10dde6da98196f1b8e1df09c06d1b470" Dec 11 13:56:15 crc kubenswrapper[4898]: E1211 
13:56:15.054173 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ac9d71d6159e9f18b08db7cf010885c10dde6da98196f1b8e1df09c06d1b470\": container with ID starting with 7ac9d71d6159e9f18b08db7cf010885c10dde6da98196f1b8e1df09c06d1b470 not found: ID does not exist" containerID="7ac9d71d6159e9f18b08db7cf010885c10dde6da98196f1b8e1df09c06d1b470" Dec 11 13:56:15 crc kubenswrapper[4898]: I1211 13:56:15.054205 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ac9d71d6159e9f18b08db7cf010885c10dde6da98196f1b8e1df09c06d1b470"} err="failed to get container status \"7ac9d71d6159e9f18b08db7cf010885c10dde6da98196f1b8e1df09c06d1b470\": rpc error: code = NotFound desc = could not find container \"7ac9d71d6159e9f18b08db7cf010885c10dde6da98196f1b8e1df09c06d1b470\": container with ID starting with 7ac9d71d6159e9f18b08db7cf010885c10dde6da98196f1b8e1df09c06d1b470 not found: ID does not exist" Dec 11 13:56:16 crc kubenswrapper[4898]: I1211 13:56:16.786988 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4ffa32d-cc15-4d59-a3d1-70161aa942bf" path="/var/lib/kubelet/pods/b4ffa32d-cc15-4d59-a3d1-70161aa942bf/volumes" Dec 11 13:56:18 crc kubenswrapper[4898]: I1211 13:56:18.775864 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:56:18 crc kubenswrapper[4898]: E1211 13:56:18.776272 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:56:29 crc kubenswrapper[4898]: I1211 13:56:29.775107 
4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:56:29 crc kubenswrapper[4898]: E1211 13:56:29.776156 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:56:39 crc kubenswrapper[4898]: I1211 13:56:39.987091 4898 scope.go:117] "RemoveContainer" containerID="30ca9d6964c13957c220c44fa454beb68b1ab5f88acde343e85db01b8cd5c77f" Dec 11 13:56:40 crc kubenswrapper[4898]: I1211 13:56:40.774913 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:56:40 crc kubenswrapper[4898]: E1211 13:56:40.775219 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:56:52 crc kubenswrapper[4898]: I1211 13:56:52.793728 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:56:52 crc kubenswrapper[4898]: E1211 13:56:52.795935 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:57:05 crc kubenswrapper[4898]: I1211 13:57:05.775119 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:57:05 crc kubenswrapper[4898]: E1211 13:57:05.776185 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:57:17 crc kubenswrapper[4898]: I1211 13:57:17.775695 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:57:17 crc kubenswrapper[4898]: E1211 13:57:17.776690 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:57:21 crc kubenswrapper[4898]: I1211 13:57:21.706316 4898 generic.go:334] "Generic (PLEG): container finished" podID="2330f5fe-a653-4a3b-8563-d9bce00bd081" containerID="a0438c55e351bdc0b34802a97d9dd73ee09f4d6c2c3e23c0fc48d2f31f5b3b90" exitCode=0 Dec 11 13:57:21 crc kubenswrapper[4898]: I1211 13:57:21.706400 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" event={"ID":"2330f5fe-a653-4a3b-8563-d9bce00bd081","Type":"ContainerDied","Data":"a0438c55e351bdc0b34802a97d9dd73ee09f4d6c2c3e23c0fc48d2f31f5b3b90"} Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.211790 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.397389 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ssh-key\") pod \"2330f5fe-a653-4a3b-8563-d9bce00bd081\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.397728 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw8lt\" (UniqueName: \"kubernetes.io/projected/2330f5fe-a653-4a3b-8563-d9bce00bd081-kube-api-access-xw8lt\") pod \"2330f5fe-a653-4a3b-8563-d9bce00bd081\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.397942 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-2\") pod \"2330f5fe-a653-4a3b-8563-d9bce00bd081\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.398043 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-1\") pod \"2330f5fe-a653-4a3b-8563-d9bce00bd081\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.398225 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-0\") pod \"2330f5fe-a653-4a3b-8563-d9bce00bd081\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.398349 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-telemetry-power-monitoring-combined-ca-bundle\") pod \"2330f5fe-a653-4a3b-8563-d9bce00bd081\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.398976 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-inventory\") pod \"2330f5fe-a653-4a3b-8563-d9bce00bd081\" (UID: \"2330f5fe-a653-4a3b-8563-d9bce00bd081\") " Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.403388 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "2330f5fe-a653-4a3b-8563-d9bce00bd081" (UID: "2330f5fe-a653-4a3b-8563-d9bce00bd081"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.408663 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2330f5fe-a653-4a3b-8563-d9bce00bd081-kube-api-access-xw8lt" (OuterVolumeSpecName: "kube-api-access-xw8lt") pod "2330f5fe-a653-4a3b-8563-d9bce00bd081" (UID: "2330f5fe-a653-4a3b-8563-d9bce00bd081"). 
InnerVolumeSpecName "kube-api-access-xw8lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.434479 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2330f5fe-a653-4a3b-8563-d9bce00bd081" (UID: "2330f5fe-a653-4a3b-8563-d9bce00bd081"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.435083 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "2330f5fe-a653-4a3b-8563-d9bce00bd081" (UID: "2330f5fe-a653-4a3b-8563-d9bce00bd081"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.454620 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "2330f5fe-a653-4a3b-8563-d9bce00bd081" (UID: "2330f5fe-a653-4a3b-8563-d9bce00bd081"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.466625 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "2330f5fe-a653-4a3b-8563-d9bce00bd081" (UID: "2330f5fe-a653-4a3b-8563-d9bce00bd081"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.475585 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-inventory" (OuterVolumeSpecName: "inventory") pod "2330f5fe-a653-4a3b-8563-d9bce00bd081" (UID: "2330f5fe-a653-4a3b-8563-d9bce00bd081"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.504092 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.504514 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw8lt\" (UniqueName: \"kubernetes.io/projected/2330f5fe-a653-4a3b-8563-d9bce00bd081-kube-api-access-xw8lt\") on node \"crc\" DevicePath \"\"" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.504604 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.504668 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.504725 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.504781 4898 reconciler_common.go:293] "Volume 
detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.504931 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2330f5fe-a653-4a3b-8563-d9bce00bd081-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.726789 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" event={"ID":"2330f5fe-a653-4a3b-8563-d9bce00bd081","Type":"ContainerDied","Data":"302bb5c2b4786ba3a877779624f270cb05e1b19d55c88752bcb92d19ba1c6d6d"} Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.726836 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="302bb5c2b4786ba3a877779624f270cb05e1b19d55c88752bcb92d19ba1c6d6d" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.726883 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.838006 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn"] Dec 11 13:57:23 crc kubenswrapper[4898]: E1211 13:57:23.839039 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ffa32d-cc15-4d59-a3d1-70161aa942bf" containerName="extract-utilities" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.839066 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ffa32d-cc15-4d59-a3d1-70161aa942bf" containerName="extract-utilities" Dec 11 13:57:23 crc kubenswrapper[4898]: E1211 13:57:23.839134 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ffa32d-cc15-4d59-a3d1-70161aa942bf" containerName="extract-content" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.839144 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ffa32d-cc15-4d59-a3d1-70161aa942bf" containerName="extract-content" Dec 11 13:57:23 crc kubenswrapper[4898]: E1211 13:57:23.839170 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ffa32d-cc15-4d59-a3d1-70161aa942bf" containerName="registry-server" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.839177 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ffa32d-cc15-4d59-a3d1-70161aa942bf" containerName="registry-server" Dec 11 13:57:23 crc kubenswrapper[4898]: E1211 13:57:23.839194 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2330f5fe-a653-4a3b-8563-d9bce00bd081" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.839204 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2330f5fe-a653-4a3b-8563-d9bce00bd081" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 11 13:57:23 crc 
kubenswrapper[4898]: I1211 13:57:23.839508 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ffa32d-cc15-4d59-a3d1-70161aa942bf" containerName="registry-server" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.839545 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2330f5fe-a653-4a3b-8563-d9bce00bd081" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.840722 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.843009 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.843107 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.843141 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-w8jc8" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.843594 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.845020 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 13:57:23 crc kubenswrapper[4898]: I1211 13:57:23.865870 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn"] Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.019758 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm7km\" (UniqueName: 
\"kubernetes.io/projected/2b009a23-c578-4a3c-aca4-b68d1b9e9118-kube-api-access-qm7km\") pod \"logging-edpm-deployment-openstack-edpm-ipam-qbckn\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.019824 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-qbckn\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.019884 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-qbckn\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.020019 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-qbckn\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.020165 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-qbckn\" (UID: 
\"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.121928 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-qbckn\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.122295 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-qbckn\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.122526 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm7km\" (UniqueName: \"kubernetes.io/projected/2b009a23-c578-4a3c-aca4-b68d1b9e9118-kube-api-access-qm7km\") pod \"logging-edpm-deployment-openstack-edpm-ipam-qbckn\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.122664 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-qbckn\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.122806 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-qbckn\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.127578 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-qbckn\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.128831 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-qbckn\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.131151 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-qbckn\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.132786 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-qbckn\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.139868 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm7km\" (UniqueName: \"kubernetes.io/projected/2b009a23-c578-4a3c-aca4-b68d1b9e9118-kube-api-access-qm7km\") pod \"logging-edpm-deployment-openstack-edpm-ipam-qbckn\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.164093 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.691057 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn"] Dec 11 13:57:24 crc kubenswrapper[4898]: I1211 13:57:24.738792 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" event={"ID":"2b009a23-c578-4a3c-aca4-b68d1b9e9118","Type":"ContainerStarted","Data":"3138ed67dd9e38b73918af77225b7733c37aee0778cb80b7bb933a8aa7c39baf"} Dec 11 13:57:26 crc kubenswrapper[4898]: I1211 13:57:26.763231 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" event={"ID":"2b009a23-c578-4a3c-aca4-b68d1b9e9118","Type":"ContainerStarted","Data":"4c0a8e85f6f03bf051d2ed22d4a535b0ad9a027530266c40a29e512243ba142c"} Dec 11 13:57:26 crc kubenswrapper[4898]: I1211 13:57:26.790979 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" podStartSLOduration=3.071167925 podStartE2EDuration="3.790958939s" podCreationTimestamp="2025-12-11 13:57:23 +0000 UTC" firstStartedPulling="2025-12-11 13:57:24.691036254 +0000 UTC m=+3202.263362691" lastFinishedPulling="2025-12-11 
13:57:25.410827268 +0000 UTC m=+3202.983153705" observedRunningTime="2025-12-11 13:57:26.780040366 +0000 UTC m=+3204.352366813" watchObservedRunningTime="2025-12-11 13:57:26.790958939 +0000 UTC m=+3204.363285376" Dec 11 13:57:28 crc kubenswrapper[4898]: I1211 13:57:28.777866 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:57:28 crc kubenswrapper[4898]: E1211 13:57:28.778491 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:57:42 crc kubenswrapper[4898]: I1211 13:57:42.963650 4898 generic.go:334] "Generic (PLEG): container finished" podID="2b009a23-c578-4a3c-aca4-b68d1b9e9118" containerID="4c0a8e85f6f03bf051d2ed22d4a535b0ad9a027530266c40a29e512243ba142c" exitCode=0 Dec 11 13:57:42 crc kubenswrapper[4898]: I1211 13:57:42.963737 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" event={"ID":"2b009a23-c578-4a3c-aca4-b68d1b9e9118","Type":"ContainerDied","Data":"4c0a8e85f6f03bf051d2ed22d4a535b0ad9a027530266c40a29e512243ba142c"} Dec 11 13:57:43 crc kubenswrapper[4898]: I1211 13:57:43.775030 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:57:43 crc kubenswrapper[4898]: E1211 13:57:43.775523 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.496437 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.603829 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-inventory\") pod \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.603950 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-ssh-key\") pod \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.604272 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-logging-compute-config-data-0\") pod \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.604551 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm7km\" (UniqueName: \"kubernetes.io/projected/2b009a23-c578-4a3c-aca4-b68d1b9e9118-kube-api-access-qm7km\") pod \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.604582 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-logging-compute-config-data-1\") pod \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\" (UID: \"2b009a23-c578-4a3c-aca4-b68d1b9e9118\") " Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.609785 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b009a23-c578-4a3c-aca4-b68d1b9e9118-kube-api-access-qm7km" (OuterVolumeSpecName: "kube-api-access-qm7km") pod "2b009a23-c578-4a3c-aca4-b68d1b9e9118" (UID: "2b009a23-c578-4a3c-aca4-b68d1b9e9118"). InnerVolumeSpecName "kube-api-access-qm7km". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.646705 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "2b009a23-c578-4a3c-aca4-b68d1b9e9118" (UID: "2b009a23-c578-4a3c-aca4-b68d1b9e9118"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.647531 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "2b009a23-c578-4a3c-aca4-b68d1b9e9118" (UID: "2b009a23-c578-4a3c-aca4-b68d1b9e9118"). InnerVolumeSpecName "logging-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.649172 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-inventory" (OuterVolumeSpecName: "inventory") pod "2b009a23-c578-4a3c-aca4-b68d1b9e9118" (UID: "2b009a23-c578-4a3c-aca4-b68d1b9e9118"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.656187 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2b009a23-c578-4a3c-aca4-b68d1b9e9118" (UID: "2b009a23-c578-4a3c-aca4-b68d1b9e9118"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.706320 4898 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.706358 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm7km\" (UniqueName: \"kubernetes.io/projected/2b009a23-c578-4a3c-aca4-b68d1b9e9118-kube-api-access-qm7km\") on node \"crc\" DevicePath \"\"" Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.706368 4898 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.706379 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-inventory\") on node \"crc\" 
DevicePath \"\"" Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.706389 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b009a23-c578-4a3c-aca4-b68d1b9e9118-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.991648 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" event={"ID":"2b009a23-c578-4a3c-aca4-b68d1b9e9118","Type":"ContainerDied","Data":"3138ed67dd9e38b73918af77225b7733c37aee0778cb80b7bb933a8aa7c39baf"} Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.992558 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3138ed67dd9e38b73918af77225b7733c37aee0778cb80b7bb933a8aa7c39baf" Dec 11 13:57:44 crc kubenswrapper[4898]: I1211 13:57:44.991681 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-qbckn" Dec 11 13:57:58 crc kubenswrapper[4898]: I1211 13:57:58.775485 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:57:58 crc kubenswrapper[4898]: E1211 13:57:58.776351 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:58:10 crc kubenswrapper[4898]: I1211 13:58:10.776018 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:58:10 crc kubenswrapper[4898]: E1211 13:58:10.778009 4898 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:58:23 crc kubenswrapper[4898]: I1211 13:58:23.776394 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:58:23 crc kubenswrapper[4898]: E1211 13:58:23.777791 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:58:36 crc kubenswrapper[4898]: I1211 13:58:36.775703 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:58:36 crc kubenswrapper[4898]: E1211 13:58:36.776948 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:58:47 crc kubenswrapper[4898]: I1211 13:58:47.776907 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:58:47 crc kubenswrapper[4898]: E1211 13:58:47.777966 4898 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:59:02 crc kubenswrapper[4898]: I1211 13:59:02.790899 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:59:02 crc kubenswrapper[4898]: E1211 13:59:02.791804 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:59:16 crc kubenswrapper[4898]: I1211 13:59:16.776008 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:59:16 crc kubenswrapper[4898]: E1211 13:59:16.776940 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:59:29 crc kubenswrapper[4898]: I1211 13:59:29.775736 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:59:29 crc kubenswrapper[4898]: E1211 13:59:29.776734 4898 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:59:37 crc kubenswrapper[4898]: I1211 13:59:37.310942 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2tm58"] Dec 11 13:59:37 crc kubenswrapper[4898]: E1211 13:59:37.312029 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b009a23-c578-4a3c-aca4-b68d1b9e9118" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 11 13:59:37 crc kubenswrapper[4898]: I1211 13:59:37.312044 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b009a23-c578-4a3c-aca4-b68d1b9e9118" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 11 13:59:37 crc kubenswrapper[4898]: I1211 13:59:37.312314 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b009a23-c578-4a3c-aca4-b68d1b9e9118" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 11 13:59:37 crc kubenswrapper[4898]: I1211 13:59:37.314324 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:37 crc kubenswrapper[4898]: I1211 13:59:37.336256 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tm58"] Dec 11 13:59:37 crc kubenswrapper[4898]: I1211 13:59:37.496690 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96ff8a69-cebb-478c-84c0-10cef85f3ee0-catalog-content\") pod \"redhat-marketplace-2tm58\" (UID: \"96ff8a69-cebb-478c-84c0-10cef85f3ee0\") " pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:37 crc kubenswrapper[4898]: I1211 13:59:37.497114 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96ff8a69-cebb-478c-84c0-10cef85f3ee0-utilities\") pod \"redhat-marketplace-2tm58\" (UID: \"96ff8a69-cebb-478c-84c0-10cef85f3ee0\") " pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:37 crc kubenswrapper[4898]: I1211 13:59:37.497197 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vq6g\" (UniqueName: \"kubernetes.io/projected/96ff8a69-cebb-478c-84c0-10cef85f3ee0-kube-api-access-6vq6g\") pod \"redhat-marketplace-2tm58\" (UID: \"96ff8a69-cebb-478c-84c0-10cef85f3ee0\") " pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:37 crc kubenswrapper[4898]: I1211 13:59:37.600396 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96ff8a69-cebb-478c-84c0-10cef85f3ee0-catalog-content\") pod \"redhat-marketplace-2tm58\" (UID: \"96ff8a69-cebb-478c-84c0-10cef85f3ee0\") " pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:37 crc kubenswrapper[4898]: I1211 13:59:37.600528 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96ff8a69-cebb-478c-84c0-10cef85f3ee0-utilities\") pod \"redhat-marketplace-2tm58\" (UID: \"96ff8a69-cebb-478c-84c0-10cef85f3ee0\") " pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:37 crc kubenswrapper[4898]: I1211 13:59:37.600678 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vq6g\" (UniqueName: \"kubernetes.io/projected/96ff8a69-cebb-478c-84c0-10cef85f3ee0-kube-api-access-6vq6g\") pod \"redhat-marketplace-2tm58\" (UID: \"96ff8a69-cebb-478c-84c0-10cef85f3ee0\") " pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:37 crc kubenswrapper[4898]: I1211 13:59:37.601163 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96ff8a69-cebb-478c-84c0-10cef85f3ee0-utilities\") pod \"redhat-marketplace-2tm58\" (UID: \"96ff8a69-cebb-478c-84c0-10cef85f3ee0\") " pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:37 crc kubenswrapper[4898]: I1211 13:59:37.601160 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96ff8a69-cebb-478c-84c0-10cef85f3ee0-catalog-content\") pod \"redhat-marketplace-2tm58\" (UID: \"96ff8a69-cebb-478c-84c0-10cef85f3ee0\") " pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:37 crc kubenswrapper[4898]: I1211 13:59:37.628292 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vq6g\" (UniqueName: \"kubernetes.io/projected/96ff8a69-cebb-478c-84c0-10cef85f3ee0-kube-api-access-6vq6g\") pod \"redhat-marketplace-2tm58\" (UID: \"96ff8a69-cebb-478c-84c0-10cef85f3ee0\") " pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:37 crc kubenswrapper[4898]: I1211 13:59:37.660802 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:38 crc kubenswrapper[4898]: I1211 13:59:38.317576 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tm58"] Dec 11 13:59:39 crc kubenswrapper[4898]: I1211 13:59:39.327619 4898 generic.go:334] "Generic (PLEG): container finished" podID="96ff8a69-cebb-478c-84c0-10cef85f3ee0" containerID="52a90dbff1a94dd5eb222944ebe4075ed23e314b07114a583d52309f173c5dec" exitCode=0 Dec 11 13:59:39 crc kubenswrapper[4898]: I1211 13:59:39.327678 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tm58" event={"ID":"96ff8a69-cebb-478c-84c0-10cef85f3ee0","Type":"ContainerDied","Data":"52a90dbff1a94dd5eb222944ebe4075ed23e314b07114a583d52309f173c5dec"} Dec 11 13:59:39 crc kubenswrapper[4898]: I1211 13:59:39.327935 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tm58" event={"ID":"96ff8a69-cebb-478c-84c0-10cef85f3ee0","Type":"ContainerStarted","Data":"9dcaeec98afefda78df0a580d8a517059f15217f3f2734143db8d22b97247319"} Dec 11 13:59:40 crc kubenswrapper[4898]: I1211 13:59:40.341144 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tm58" event={"ID":"96ff8a69-cebb-478c-84c0-10cef85f3ee0","Type":"ContainerStarted","Data":"f4622adfe43964e33d43593dd0b219f63b72f3bde23ced621c2c39eea193c101"} Dec 11 13:59:41 crc kubenswrapper[4898]: I1211 13:59:41.775881 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:59:41 crc kubenswrapper[4898]: E1211 13:59:41.776503 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 13:59:42 crc kubenswrapper[4898]: I1211 13:59:42.368422 4898 generic.go:334] "Generic (PLEG): container finished" podID="96ff8a69-cebb-478c-84c0-10cef85f3ee0" containerID="f4622adfe43964e33d43593dd0b219f63b72f3bde23ced621c2c39eea193c101" exitCode=0 Dec 11 13:59:42 crc kubenswrapper[4898]: I1211 13:59:42.368481 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tm58" event={"ID":"96ff8a69-cebb-478c-84c0-10cef85f3ee0","Type":"ContainerDied","Data":"f4622adfe43964e33d43593dd0b219f63b72f3bde23ced621c2c39eea193c101"} Dec 11 13:59:43 crc kubenswrapper[4898]: I1211 13:59:43.385471 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tm58" event={"ID":"96ff8a69-cebb-478c-84c0-10cef85f3ee0","Type":"ContainerStarted","Data":"d707f03125e521950534c4bfaaa4639aaf2a5e57114fe7d8983b09a538004add"} Dec 11 13:59:43 crc kubenswrapper[4898]: I1211 13:59:43.413522 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2tm58" podStartSLOduration=2.68377278 podStartE2EDuration="6.413498418s" podCreationTimestamp="2025-12-11 13:59:37 +0000 UTC" firstStartedPulling="2025-12-11 13:59:39.329417586 +0000 UTC m=+3336.901744023" lastFinishedPulling="2025-12-11 13:59:43.059143184 +0000 UTC m=+3340.631469661" observedRunningTime="2025-12-11 13:59:43.404971729 +0000 UTC m=+3340.977298166" watchObservedRunningTime="2025-12-11 13:59:43.413498418 +0000 UTC m=+3340.985824865" Dec 11 13:59:47 crc kubenswrapper[4898]: I1211 13:59:47.661832 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:47 crc kubenswrapper[4898]: I1211 
13:59:47.662624 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:47 crc kubenswrapper[4898]: I1211 13:59:47.727681 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:48 crc kubenswrapper[4898]: I1211 13:59:48.499326 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:48 crc kubenswrapper[4898]: I1211 13:59:48.559342 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tm58"] Dec 11 13:59:50 crc kubenswrapper[4898]: I1211 13:59:50.454781 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2tm58" podUID="96ff8a69-cebb-478c-84c0-10cef85f3ee0" containerName="registry-server" containerID="cri-o://d707f03125e521950534c4bfaaa4639aaf2a5e57114fe7d8983b09a538004add" gracePeriod=2 Dec 11 13:59:51 crc kubenswrapper[4898]: I1211 13:59:51.470991 4898 generic.go:334] "Generic (PLEG): container finished" podID="96ff8a69-cebb-478c-84c0-10cef85f3ee0" containerID="d707f03125e521950534c4bfaaa4639aaf2a5e57114fe7d8983b09a538004add" exitCode=0 Dec 11 13:59:51 crc kubenswrapper[4898]: I1211 13:59:51.471053 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tm58" event={"ID":"96ff8a69-cebb-478c-84c0-10cef85f3ee0","Type":"ContainerDied","Data":"d707f03125e521950534c4bfaaa4639aaf2a5e57114fe7d8983b09a538004add"} Dec 11 13:59:51 crc kubenswrapper[4898]: I1211 13:59:51.760159 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:51 crc kubenswrapper[4898]: I1211 13:59:51.884325 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96ff8a69-cebb-478c-84c0-10cef85f3ee0-utilities\") pod \"96ff8a69-cebb-478c-84c0-10cef85f3ee0\" (UID: \"96ff8a69-cebb-478c-84c0-10cef85f3ee0\") " Dec 11 13:59:51 crc kubenswrapper[4898]: I1211 13:59:51.884672 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96ff8a69-cebb-478c-84c0-10cef85f3ee0-catalog-content\") pod \"96ff8a69-cebb-478c-84c0-10cef85f3ee0\" (UID: \"96ff8a69-cebb-478c-84c0-10cef85f3ee0\") " Dec 11 13:59:51 crc kubenswrapper[4898]: I1211 13:59:51.884778 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vq6g\" (UniqueName: \"kubernetes.io/projected/96ff8a69-cebb-478c-84c0-10cef85f3ee0-kube-api-access-6vq6g\") pod \"96ff8a69-cebb-478c-84c0-10cef85f3ee0\" (UID: \"96ff8a69-cebb-478c-84c0-10cef85f3ee0\") " Dec 11 13:59:51 crc kubenswrapper[4898]: I1211 13:59:51.885141 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96ff8a69-cebb-478c-84c0-10cef85f3ee0-utilities" (OuterVolumeSpecName: "utilities") pod "96ff8a69-cebb-478c-84c0-10cef85f3ee0" (UID: "96ff8a69-cebb-478c-84c0-10cef85f3ee0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:59:51 crc kubenswrapper[4898]: I1211 13:59:51.885959 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96ff8a69-cebb-478c-84c0-10cef85f3ee0-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 13:59:51 crc kubenswrapper[4898]: I1211 13:59:51.908313 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96ff8a69-cebb-478c-84c0-10cef85f3ee0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96ff8a69-cebb-478c-84c0-10cef85f3ee0" (UID: "96ff8a69-cebb-478c-84c0-10cef85f3ee0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 13:59:51 crc kubenswrapper[4898]: I1211 13:59:51.908922 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ff8a69-cebb-478c-84c0-10cef85f3ee0-kube-api-access-6vq6g" (OuterVolumeSpecName: "kube-api-access-6vq6g") pod "96ff8a69-cebb-478c-84c0-10cef85f3ee0" (UID: "96ff8a69-cebb-478c-84c0-10cef85f3ee0"). InnerVolumeSpecName "kube-api-access-6vq6g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 13:59:51 crc kubenswrapper[4898]: I1211 13:59:51.989060 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96ff8a69-cebb-478c-84c0-10cef85f3ee0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 13:59:51 crc kubenswrapper[4898]: I1211 13:59:51.989101 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vq6g\" (UniqueName: \"kubernetes.io/projected/96ff8a69-cebb-478c-84c0-10cef85f3ee0-kube-api-access-6vq6g\") on node \"crc\" DevicePath \"\"" Dec 11 13:59:52 crc kubenswrapper[4898]: I1211 13:59:52.483959 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tm58" event={"ID":"96ff8a69-cebb-478c-84c0-10cef85f3ee0","Type":"ContainerDied","Data":"9dcaeec98afefda78df0a580d8a517059f15217f3f2734143db8d22b97247319"} Dec 11 13:59:52 crc kubenswrapper[4898]: I1211 13:59:52.484043 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tm58" Dec 11 13:59:52 crc kubenswrapper[4898]: I1211 13:59:52.484325 4898 scope.go:117] "RemoveContainer" containerID="d707f03125e521950534c4bfaaa4639aaf2a5e57114fe7d8983b09a538004add" Dec 11 13:59:52 crc kubenswrapper[4898]: I1211 13:59:52.511337 4898 scope.go:117] "RemoveContainer" containerID="f4622adfe43964e33d43593dd0b219f63b72f3bde23ced621c2c39eea193c101" Dec 11 13:59:52 crc kubenswrapper[4898]: I1211 13:59:52.521218 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tm58"] Dec 11 13:59:52 crc kubenswrapper[4898]: I1211 13:59:52.541704 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tm58"] Dec 11 13:59:52 crc kubenswrapper[4898]: I1211 13:59:52.548727 4898 scope.go:117] "RemoveContainer" containerID="52a90dbff1a94dd5eb222944ebe4075ed23e314b07114a583d52309f173c5dec" Dec 11 13:59:52 crc kubenswrapper[4898]: I1211 13:59:52.798672 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96ff8a69-cebb-478c-84c0-10cef85f3ee0" path="/var/lib/kubelet/pods/96ff8a69-cebb-478c-84c0-10cef85f3ee0/volumes" Dec 11 13:59:54 crc kubenswrapper[4898]: I1211 13:59:54.775261 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 13:59:54 crc kubenswrapper[4898]: E1211 13:59:54.775667 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.151154 4898 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj"] Dec 11 14:00:00 crc kubenswrapper[4898]: E1211 14:00:00.153809 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ff8a69-cebb-478c-84c0-10cef85f3ee0" containerName="registry-server" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.153827 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ff8a69-cebb-478c-84c0-10cef85f3ee0" containerName="registry-server" Dec 11 14:00:00 crc kubenswrapper[4898]: E1211 14:00:00.153850 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ff8a69-cebb-478c-84c0-10cef85f3ee0" containerName="extract-content" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.153857 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ff8a69-cebb-478c-84c0-10cef85f3ee0" containerName="extract-content" Dec 11 14:00:00 crc kubenswrapper[4898]: E1211 14:00:00.153890 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ff8a69-cebb-478c-84c0-10cef85f3ee0" containerName="extract-utilities" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.153898 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ff8a69-cebb-478c-84c0-10cef85f3ee0" containerName="extract-utilities" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.154196 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ff8a69-cebb-478c-84c0-10cef85f3ee0" containerName="registry-server" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.155283 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.157678 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.158037 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.177479 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj"] Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.185224 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmzwg\" (UniqueName: \"kubernetes.io/projected/016e1dee-c122-4f9b-8720-19bf59ce0987-kube-api-access-cmzwg\") pod \"collect-profiles-29424360-8ltlj\" (UID: \"016e1dee-c122-4f9b-8720-19bf59ce0987\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.185489 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/016e1dee-c122-4f9b-8720-19bf59ce0987-config-volume\") pod \"collect-profiles-29424360-8ltlj\" (UID: \"016e1dee-c122-4f9b-8720-19bf59ce0987\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.185689 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/016e1dee-c122-4f9b-8720-19bf59ce0987-secret-volume\") pod \"collect-profiles-29424360-8ltlj\" (UID: \"016e1dee-c122-4f9b-8720-19bf59ce0987\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.287565 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/016e1dee-c122-4f9b-8720-19bf59ce0987-secret-volume\") pod \"collect-profiles-29424360-8ltlj\" (UID: \"016e1dee-c122-4f9b-8720-19bf59ce0987\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.287646 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmzwg\" (UniqueName: \"kubernetes.io/projected/016e1dee-c122-4f9b-8720-19bf59ce0987-kube-api-access-cmzwg\") pod \"collect-profiles-29424360-8ltlj\" (UID: \"016e1dee-c122-4f9b-8720-19bf59ce0987\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.287790 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/016e1dee-c122-4f9b-8720-19bf59ce0987-config-volume\") pod \"collect-profiles-29424360-8ltlj\" (UID: \"016e1dee-c122-4f9b-8720-19bf59ce0987\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.288719 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/016e1dee-c122-4f9b-8720-19bf59ce0987-config-volume\") pod \"collect-profiles-29424360-8ltlj\" (UID: \"016e1dee-c122-4f9b-8720-19bf59ce0987\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.293086 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/016e1dee-c122-4f9b-8720-19bf59ce0987-secret-volume\") pod \"collect-profiles-29424360-8ltlj\" (UID: \"016e1dee-c122-4f9b-8720-19bf59ce0987\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.308676 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmzwg\" (UniqueName: \"kubernetes.io/projected/016e1dee-c122-4f9b-8720-19bf59ce0987-kube-api-access-cmzwg\") pod \"collect-profiles-29424360-8ltlj\" (UID: \"016e1dee-c122-4f9b-8720-19bf59ce0987\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.481967 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" Dec 11 14:00:00 crc kubenswrapper[4898]: I1211 14:00:00.965852 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj"] Dec 11 14:00:01 crc kubenswrapper[4898]: I1211 14:00:01.594491 4898 generic.go:334] "Generic (PLEG): container finished" podID="016e1dee-c122-4f9b-8720-19bf59ce0987" containerID="50152bfd32e95e33b7afd08f1a0665c7beb6fb918ba3c6e5d20e85c9572b44d4" exitCode=0 Dec 11 14:00:01 crc kubenswrapper[4898]: I1211 14:00:01.594809 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" event={"ID":"016e1dee-c122-4f9b-8720-19bf59ce0987","Type":"ContainerDied","Data":"50152bfd32e95e33b7afd08f1a0665c7beb6fb918ba3c6e5d20e85c9572b44d4"} Dec 11 14:00:01 crc kubenswrapper[4898]: I1211 14:00:01.595336 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" 
event={"ID":"016e1dee-c122-4f9b-8720-19bf59ce0987","Type":"ContainerStarted","Data":"8c78618175bfec62d26e868e73653b48b563a840e801f6506207bcfcf6e4c469"} Dec 11 14:00:03 crc kubenswrapper[4898]: I1211 14:00:03.014201 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" Dec 11 14:00:03 crc kubenswrapper[4898]: I1211 14:00:03.187478 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmzwg\" (UniqueName: \"kubernetes.io/projected/016e1dee-c122-4f9b-8720-19bf59ce0987-kube-api-access-cmzwg\") pod \"016e1dee-c122-4f9b-8720-19bf59ce0987\" (UID: \"016e1dee-c122-4f9b-8720-19bf59ce0987\") " Dec 11 14:00:03 crc kubenswrapper[4898]: I1211 14:00:03.187690 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/016e1dee-c122-4f9b-8720-19bf59ce0987-config-volume\") pod \"016e1dee-c122-4f9b-8720-19bf59ce0987\" (UID: \"016e1dee-c122-4f9b-8720-19bf59ce0987\") " Dec 11 14:00:03 crc kubenswrapper[4898]: I1211 14:00:03.187772 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/016e1dee-c122-4f9b-8720-19bf59ce0987-secret-volume\") pod \"016e1dee-c122-4f9b-8720-19bf59ce0987\" (UID: \"016e1dee-c122-4f9b-8720-19bf59ce0987\") " Dec 11 14:00:03 crc kubenswrapper[4898]: I1211 14:00:03.188228 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016e1dee-c122-4f9b-8720-19bf59ce0987-config-volume" (OuterVolumeSpecName: "config-volume") pod "016e1dee-c122-4f9b-8720-19bf59ce0987" (UID: "016e1dee-c122-4f9b-8720-19bf59ce0987"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 14:00:03 crc kubenswrapper[4898]: I1211 14:00:03.193709 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/016e1dee-c122-4f9b-8720-19bf59ce0987-kube-api-access-cmzwg" (OuterVolumeSpecName: "kube-api-access-cmzwg") pod "016e1dee-c122-4f9b-8720-19bf59ce0987" (UID: "016e1dee-c122-4f9b-8720-19bf59ce0987"). InnerVolumeSpecName "kube-api-access-cmzwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:00:03 crc kubenswrapper[4898]: I1211 14:00:03.201800 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/016e1dee-c122-4f9b-8720-19bf59ce0987-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "016e1dee-c122-4f9b-8720-19bf59ce0987" (UID: "016e1dee-c122-4f9b-8720-19bf59ce0987"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 14:00:03 crc kubenswrapper[4898]: I1211 14:00:03.290550 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/016e1dee-c122-4f9b-8720-19bf59ce0987-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 14:00:03 crc kubenswrapper[4898]: I1211 14:00:03.290592 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/016e1dee-c122-4f9b-8720-19bf59ce0987-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 11 14:00:03 crc kubenswrapper[4898]: I1211 14:00:03.290605 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmzwg\" (UniqueName: \"kubernetes.io/projected/016e1dee-c122-4f9b-8720-19bf59ce0987-kube-api-access-cmzwg\") on node \"crc\" DevicePath \"\"" Dec 11 14:00:03 crc kubenswrapper[4898]: I1211 14:00:03.621401 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" 
event={"ID":"016e1dee-c122-4f9b-8720-19bf59ce0987","Type":"ContainerDied","Data":"8c78618175bfec62d26e868e73653b48b563a840e801f6506207bcfcf6e4c469"} Dec 11 14:00:03 crc kubenswrapper[4898]: I1211 14:00:03.621697 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c78618175bfec62d26e868e73653b48b563a840e801f6506207bcfcf6e4c469" Dec 11 14:00:03 crc kubenswrapper[4898]: I1211 14:00:03.621518 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj" Dec 11 14:00:04 crc kubenswrapper[4898]: I1211 14:00:04.093414 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn"] Dec 11 14:00:04 crc kubenswrapper[4898]: I1211 14:00:04.116237 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424315-nc8pn"] Dec 11 14:00:04 crc kubenswrapper[4898]: I1211 14:00:04.789947 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eff25b9-7a31-4118-b8c3-c57b8b4714fa" path="/var/lib/kubelet/pods/2eff25b9-7a31-4118-b8c3-c57b8b4714fa/volumes" Dec 11 14:00:06 crc kubenswrapper[4898]: I1211 14:00:06.775969 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 14:00:06 crc kubenswrapper[4898]: E1211 14:00:06.776657 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:00:19 crc kubenswrapper[4898]: I1211 14:00:19.775000 4898 scope.go:117] "RemoveContainer" 
containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 14:00:19 crc kubenswrapper[4898]: E1211 14:00:19.775715 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:00:20 crc kubenswrapper[4898]: I1211 14:00:20.866968 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g7vh2"] Dec 11 14:00:20 crc kubenswrapper[4898]: E1211 14:00:20.867669 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="016e1dee-c122-4f9b-8720-19bf59ce0987" containerName="collect-profiles" Dec 11 14:00:20 crc kubenswrapper[4898]: I1211 14:00:20.867689 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="016e1dee-c122-4f9b-8720-19bf59ce0987" containerName="collect-profiles" Dec 11 14:00:20 crc kubenswrapper[4898]: I1211 14:00:20.868023 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="016e1dee-c122-4f9b-8720-19bf59ce0987" containerName="collect-profiles" Dec 11 14:00:20 crc kubenswrapper[4898]: I1211 14:00:20.870270 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:20 crc kubenswrapper[4898]: I1211 14:00:20.915107 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g7vh2"] Dec 11 14:00:21 crc kubenswrapper[4898]: I1211 14:00:21.006628 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-catalog-content\") pod \"certified-operators-g7vh2\" (UID: \"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2\") " pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:21 crc kubenswrapper[4898]: I1211 14:00:21.006994 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brkg4\" (UniqueName: \"kubernetes.io/projected/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-kube-api-access-brkg4\") pod \"certified-operators-g7vh2\" (UID: \"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2\") " pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:21 crc kubenswrapper[4898]: I1211 14:00:21.007176 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-utilities\") pod \"certified-operators-g7vh2\" (UID: \"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2\") " pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:21 crc kubenswrapper[4898]: I1211 14:00:21.109219 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brkg4\" (UniqueName: \"kubernetes.io/projected/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-kube-api-access-brkg4\") pod \"certified-operators-g7vh2\" (UID: \"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2\") " pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:21 crc kubenswrapper[4898]: I1211 14:00:21.109517 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-utilities\") pod \"certified-operators-g7vh2\" (UID: \"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2\") " pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:21 crc kubenswrapper[4898]: I1211 14:00:21.109648 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-catalog-content\") pod \"certified-operators-g7vh2\" (UID: \"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2\") " pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:21 crc kubenswrapper[4898]: I1211 14:00:21.110080 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-utilities\") pod \"certified-operators-g7vh2\" (UID: \"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2\") " pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:21 crc kubenswrapper[4898]: I1211 14:00:21.110096 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-catalog-content\") pod \"certified-operators-g7vh2\" (UID: \"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2\") " pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:21 crc kubenswrapper[4898]: I1211 14:00:21.136403 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brkg4\" (UniqueName: \"kubernetes.io/projected/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-kube-api-access-brkg4\") pod \"certified-operators-g7vh2\" (UID: \"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2\") " pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:21 crc kubenswrapper[4898]: I1211 14:00:21.228786 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:21 crc kubenswrapper[4898]: I1211 14:00:21.880789 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g7vh2"] Dec 11 14:00:21 crc kubenswrapper[4898]: W1211 14:00:21.892632 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57ab3cc8_ec1d_45a6_9670_e4bc9c5e32b2.slice/crio-8f3ea153a4dcdad445cc742ff24b3309df4aadcfbfcb7223a49ec289bf7ed956 WatchSource:0}: Error finding container 8f3ea153a4dcdad445cc742ff24b3309df4aadcfbfcb7223a49ec289bf7ed956: Status 404 returned error can't find the container with id 8f3ea153a4dcdad445cc742ff24b3309df4aadcfbfcb7223a49ec289bf7ed956 Dec 11 14:00:22 crc kubenswrapper[4898]: I1211 14:00:22.831827 4898 generic.go:334] "Generic (PLEG): container finished" podID="57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2" containerID="f2b0c779f702169bd401a2929d7d18a39ebeddb8d9b75c1fa1cf5bd4360117ae" exitCode=0 Dec 11 14:00:22 crc kubenswrapper[4898]: I1211 14:00:22.832089 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g7vh2" event={"ID":"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2","Type":"ContainerDied","Data":"f2b0c779f702169bd401a2929d7d18a39ebeddb8d9b75c1fa1cf5bd4360117ae"} Dec 11 14:00:22 crc kubenswrapper[4898]: I1211 14:00:22.832115 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g7vh2" event={"ID":"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2","Type":"ContainerStarted","Data":"8f3ea153a4dcdad445cc742ff24b3309df4aadcfbfcb7223a49ec289bf7ed956"} Dec 11 14:00:22 crc kubenswrapper[4898]: I1211 14:00:22.834448 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 14:00:23 crc kubenswrapper[4898]: I1211 14:00:23.843604 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-g7vh2" event={"ID":"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2","Type":"ContainerStarted","Data":"22f1d82d5e1de5128e39412ce0116ce7317fb2bfd0fcf41c71e87b0bfcb724a1"} Dec 11 14:00:25 crc kubenswrapper[4898]: I1211 14:00:25.873790 4898 generic.go:334] "Generic (PLEG): container finished" podID="57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2" containerID="22f1d82d5e1de5128e39412ce0116ce7317fb2bfd0fcf41c71e87b0bfcb724a1" exitCode=0 Dec 11 14:00:25 crc kubenswrapper[4898]: I1211 14:00:25.874098 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g7vh2" event={"ID":"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2","Type":"ContainerDied","Data":"22f1d82d5e1de5128e39412ce0116ce7317fb2bfd0fcf41c71e87b0bfcb724a1"} Dec 11 14:00:26 crc kubenswrapper[4898]: I1211 14:00:26.887911 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g7vh2" event={"ID":"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2","Type":"ContainerStarted","Data":"ed4374b6f9b2f907e280dff247a4db8f60b69d7cc0c0ad03c8e87eee29ffc48f"} Dec 11 14:00:26 crc kubenswrapper[4898]: I1211 14:00:26.911818 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g7vh2" podStartSLOduration=3.456805599 podStartE2EDuration="6.911797872s" podCreationTimestamp="2025-12-11 14:00:20 +0000 UTC" firstStartedPulling="2025-12-11 14:00:22.833975409 +0000 UTC m=+3380.406301846" lastFinishedPulling="2025-12-11 14:00:26.288967682 +0000 UTC m=+3383.861294119" observedRunningTime="2025-12-11 14:00:26.908919975 +0000 UTC m=+3384.481246422" watchObservedRunningTime="2025-12-11 14:00:26.911797872 +0000 UTC m=+3384.484124309" Dec 11 14:00:30 crc kubenswrapper[4898]: I1211 14:00:30.776740 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 14:00:30 crc kubenswrapper[4898]: E1211 14:00:30.779977 4898 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:00:31 crc kubenswrapper[4898]: I1211 14:00:31.229301 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:31 crc kubenswrapper[4898]: I1211 14:00:31.231872 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:31 crc kubenswrapper[4898]: I1211 14:00:31.278138 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:32 crc kubenswrapper[4898]: I1211 14:00:32.017339 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:32 crc kubenswrapper[4898]: I1211 14:00:32.079257 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g7vh2"] Dec 11 14:00:33 crc kubenswrapper[4898]: I1211 14:00:33.965251 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g7vh2" podUID="57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2" containerName="registry-server" containerID="cri-o://ed4374b6f9b2f907e280dff247a4db8f60b69d7cc0c0ad03c8e87eee29ffc48f" gracePeriod=2 Dec 11 14:00:34 crc kubenswrapper[4898]: I1211 14:00:34.695668 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:34 crc kubenswrapper[4898]: I1211 14:00:34.790394 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-catalog-content\") pod \"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2\" (UID: \"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2\") " Dec 11 14:00:34 crc kubenswrapper[4898]: I1211 14:00:34.790549 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brkg4\" (UniqueName: \"kubernetes.io/projected/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-kube-api-access-brkg4\") pod \"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2\" (UID: \"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2\") " Dec 11 14:00:34 crc kubenswrapper[4898]: I1211 14:00:34.790675 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-utilities\") pod \"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2\" (UID: \"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2\") " Dec 11 14:00:34 crc kubenswrapper[4898]: I1211 14:00:34.792116 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-utilities" (OuterVolumeSpecName: "utilities") pod "57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2" (UID: "57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:00:34 crc kubenswrapper[4898]: I1211 14:00:34.798849 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-kube-api-access-brkg4" (OuterVolumeSpecName: "kube-api-access-brkg4") pod "57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2" (UID: "57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2"). InnerVolumeSpecName "kube-api-access-brkg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:00:34 crc kubenswrapper[4898]: I1211 14:00:34.852916 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2" (UID: "57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:00:34 crc kubenswrapper[4898]: I1211 14:00:34.893111 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:00:34 crc kubenswrapper[4898]: I1211 14:00:34.893361 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:00:34 crc kubenswrapper[4898]: I1211 14:00:34.893451 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brkg4\" (UniqueName: \"kubernetes.io/projected/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2-kube-api-access-brkg4\") on node \"crc\" DevicePath \"\"" Dec 11 14:00:34 crc kubenswrapper[4898]: I1211 14:00:34.976954 4898 generic.go:334] "Generic (PLEG): container finished" podID="57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2" containerID="ed4374b6f9b2f907e280dff247a4db8f60b69d7cc0c0ad03c8e87eee29ffc48f" exitCode=0 Dec 11 14:00:34 crc kubenswrapper[4898]: I1211 14:00:34.977015 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g7vh2" Dec 11 14:00:34 crc kubenswrapper[4898]: I1211 14:00:34.977041 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g7vh2" event={"ID":"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2","Type":"ContainerDied","Data":"ed4374b6f9b2f907e280dff247a4db8f60b69d7cc0c0ad03c8e87eee29ffc48f"} Dec 11 14:00:34 crc kubenswrapper[4898]: I1211 14:00:34.977916 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g7vh2" event={"ID":"57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2","Type":"ContainerDied","Data":"8f3ea153a4dcdad445cc742ff24b3309df4aadcfbfcb7223a49ec289bf7ed956"} Dec 11 14:00:34 crc kubenswrapper[4898]: I1211 14:00:34.977942 4898 scope.go:117] "RemoveContainer" containerID="ed4374b6f9b2f907e280dff247a4db8f60b69d7cc0c0ad03c8e87eee29ffc48f" Dec 11 14:00:35 crc kubenswrapper[4898]: I1211 14:00:35.011693 4898 scope.go:117] "RemoveContainer" containerID="22f1d82d5e1de5128e39412ce0116ce7317fb2bfd0fcf41c71e87b0bfcb724a1" Dec 11 14:00:35 crc kubenswrapper[4898]: I1211 14:00:35.016313 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g7vh2"] Dec 11 14:00:35 crc kubenswrapper[4898]: I1211 14:00:35.026320 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g7vh2"] Dec 11 14:00:35 crc kubenswrapper[4898]: I1211 14:00:35.033431 4898 scope.go:117] "RemoveContainer" containerID="f2b0c779f702169bd401a2929d7d18a39ebeddb8d9b75c1fa1cf5bd4360117ae" Dec 11 14:00:35 crc kubenswrapper[4898]: I1211 14:00:35.104230 4898 scope.go:117] "RemoveContainer" containerID="ed4374b6f9b2f907e280dff247a4db8f60b69d7cc0c0ad03c8e87eee29ffc48f" Dec 11 14:00:35 crc kubenswrapper[4898]: E1211 14:00:35.105075 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ed4374b6f9b2f907e280dff247a4db8f60b69d7cc0c0ad03c8e87eee29ffc48f\": container with ID starting with ed4374b6f9b2f907e280dff247a4db8f60b69d7cc0c0ad03c8e87eee29ffc48f not found: ID does not exist" containerID="ed4374b6f9b2f907e280dff247a4db8f60b69d7cc0c0ad03c8e87eee29ffc48f" Dec 11 14:00:35 crc kubenswrapper[4898]: I1211 14:00:35.105128 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed4374b6f9b2f907e280dff247a4db8f60b69d7cc0c0ad03c8e87eee29ffc48f"} err="failed to get container status \"ed4374b6f9b2f907e280dff247a4db8f60b69d7cc0c0ad03c8e87eee29ffc48f\": rpc error: code = NotFound desc = could not find container \"ed4374b6f9b2f907e280dff247a4db8f60b69d7cc0c0ad03c8e87eee29ffc48f\": container with ID starting with ed4374b6f9b2f907e280dff247a4db8f60b69d7cc0c0ad03c8e87eee29ffc48f not found: ID does not exist" Dec 11 14:00:35 crc kubenswrapper[4898]: I1211 14:00:35.105158 4898 scope.go:117] "RemoveContainer" containerID="22f1d82d5e1de5128e39412ce0116ce7317fb2bfd0fcf41c71e87b0bfcb724a1" Dec 11 14:00:35 crc kubenswrapper[4898]: E1211 14:00:35.105575 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22f1d82d5e1de5128e39412ce0116ce7317fb2bfd0fcf41c71e87b0bfcb724a1\": container with ID starting with 22f1d82d5e1de5128e39412ce0116ce7317fb2bfd0fcf41c71e87b0bfcb724a1 not found: ID does not exist" containerID="22f1d82d5e1de5128e39412ce0116ce7317fb2bfd0fcf41c71e87b0bfcb724a1" Dec 11 14:00:35 crc kubenswrapper[4898]: I1211 14:00:35.105612 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22f1d82d5e1de5128e39412ce0116ce7317fb2bfd0fcf41c71e87b0bfcb724a1"} err="failed to get container status \"22f1d82d5e1de5128e39412ce0116ce7317fb2bfd0fcf41c71e87b0bfcb724a1\": rpc error: code = NotFound desc = could not find container \"22f1d82d5e1de5128e39412ce0116ce7317fb2bfd0fcf41c71e87b0bfcb724a1\": container with ID 
starting with 22f1d82d5e1de5128e39412ce0116ce7317fb2bfd0fcf41c71e87b0bfcb724a1 not found: ID does not exist" Dec 11 14:00:35 crc kubenswrapper[4898]: I1211 14:00:35.105633 4898 scope.go:117] "RemoveContainer" containerID="f2b0c779f702169bd401a2929d7d18a39ebeddb8d9b75c1fa1cf5bd4360117ae" Dec 11 14:00:35 crc kubenswrapper[4898]: E1211 14:00:35.105931 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b0c779f702169bd401a2929d7d18a39ebeddb8d9b75c1fa1cf5bd4360117ae\": container with ID starting with f2b0c779f702169bd401a2929d7d18a39ebeddb8d9b75c1fa1cf5bd4360117ae not found: ID does not exist" containerID="f2b0c779f702169bd401a2929d7d18a39ebeddb8d9b75c1fa1cf5bd4360117ae" Dec 11 14:00:35 crc kubenswrapper[4898]: I1211 14:00:35.105956 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b0c779f702169bd401a2929d7d18a39ebeddb8d9b75c1fa1cf5bd4360117ae"} err="failed to get container status \"f2b0c779f702169bd401a2929d7d18a39ebeddb8d9b75c1fa1cf5bd4360117ae\": rpc error: code = NotFound desc = could not find container \"f2b0c779f702169bd401a2929d7d18a39ebeddb8d9b75c1fa1cf5bd4360117ae\": container with ID starting with f2b0c779f702169bd401a2929d7d18a39ebeddb8d9b75c1fa1cf5bd4360117ae not found: ID does not exist" Dec 11 14:00:36 crc kubenswrapper[4898]: I1211 14:00:36.788536 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2" path="/var/lib/kubelet/pods/57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2/volumes" Dec 11 14:00:40 crc kubenswrapper[4898]: I1211 14:00:40.141185 4898 scope.go:117] "RemoveContainer" containerID="be19fc1280d69a9d6fa41cdb51d79c5e5bb5bf4dee9f5d0453662f5904a9dc80" Dec 11 14:00:41 crc kubenswrapper[4898]: I1211 14:00:41.775065 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 14:00:41 crc kubenswrapper[4898]: 
E1211 14:00:41.776025 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:00:55 crc kubenswrapper[4898]: I1211 14:00:55.775085 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 14:00:55 crc kubenswrapper[4898]: E1211 14:00:55.775886 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.163340 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29424361-bt5ps"] Dec 11 14:01:00 crc kubenswrapper[4898]: E1211 14:01:00.166058 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2" containerName="extract-utilities" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.166205 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2" containerName="extract-utilities" Dec 11 14:01:00 crc kubenswrapper[4898]: E1211 14:01:00.166335 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2" containerName="extract-content" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.166448 4898 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2" containerName="extract-content" Dec 11 14:01:00 crc kubenswrapper[4898]: E1211 14:01:00.166622 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2" containerName="registry-server" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.166755 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2" containerName="registry-server" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.167261 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ab3cc8-ec1d-45a6-9670-e4bc9c5e32b2" containerName="registry-server" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.168796 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.174171 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29424361-bt5ps"] Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.212956 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-config-data\") pod \"keystone-cron-29424361-bt5ps\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.213011 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-fernet-keys\") pod \"keystone-cron-29424361-bt5ps\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.213044 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-r9fbb\" (UniqueName: \"kubernetes.io/projected/587f3b75-4378-4c14-a2e0-e990a8270221-kube-api-access-r9fbb\") pod \"keystone-cron-29424361-bt5ps\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.213307 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-combined-ca-bundle\") pod \"keystone-cron-29424361-bt5ps\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.315421 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9fbb\" (UniqueName: \"kubernetes.io/projected/587f3b75-4378-4c14-a2e0-e990a8270221-kube-api-access-r9fbb\") pod \"keystone-cron-29424361-bt5ps\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.315794 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-combined-ca-bundle\") pod \"keystone-cron-29424361-bt5ps\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.315914 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-config-data\") pod \"keystone-cron-29424361-bt5ps\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.315988 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-fernet-keys\") pod \"keystone-cron-29424361-bt5ps\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.322307 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-combined-ca-bundle\") pod \"keystone-cron-29424361-bt5ps\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.322667 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-fernet-keys\") pod \"keystone-cron-29424361-bt5ps\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.330387 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-config-data\") pod \"keystone-cron-29424361-bt5ps\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.338019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9fbb\" (UniqueName: \"kubernetes.io/projected/587f3b75-4378-4c14-a2e0-e990a8270221-kube-api-access-r9fbb\") pod \"keystone-cron-29424361-bt5ps\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:00 crc kubenswrapper[4898]: I1211 14:01:00.501238 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:01 crc kubenswrapper[4898]: I1211 14:01:01.044523 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29424361-bt5ps"] Dec 11 14:01:01 crc kubenswrapper[4898]: W1211 14:01:01.048677 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod587f3b75_4378_4c14_a2e0_e990a8270221.slice/crio-ffb820e9a3a61a9c373af791433a8042bfc4db65fcf3e0157931f0a3878c3a42 WatchSource:0}: Error finding container ffb820e9a3a61a9c373af791433a8042bfc4db65fcf3e0157931f0a3878c3a42: Status 404 returned error can't find the container with id ffb820e9a3a61a9c373af791433a8042bfc4db65fcf3e0157931f0a3878c3a42 Dec 11 14:01:01 crc kubenswrapper[4898]: I1211 14:01:01.285536 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29424361-bt5ps" event={"ID":"587f3b75-4378-4c14-a2e0-e990a8270221","Type":"ContainerStarted","Data":"990a62f9ff82ff3530c329b636c3527df7a6c684d6a51abdb5a33f86435d293b"} Dec 11 14:01:01 crc kubenswrapper[4898]: I1211 14:01:01.285581 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29424361-bt5ps" event={"ID":"587f3b75-4378-4c14-a2e0-e990a8270221","Type":"ContainerStarted","Data":"ffb820e9a3a61a9c373af791433a8042bfc4db65fcf3e0157931f0a3878c3a42"} Dec 11 14:01:01 crc kubenswrapper[4898]: I1211 14:01:01.303874 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29424361-bt5ps" podStartSLOduration=1.303855759 podStartE2EDuration="1.303855759s" podCreationTimestamp="2025-12-11 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 14:01:01.301067354 +0000 UTC m=+3418.873393781" watchObservedRunningTime="2025-12-11 14:01:01.303855759 +0000 UTC m=+3418.876182206" Dec 11 14:01:04 crc 
kubenswrapper[4898]: I1211 14:01:04.335036 4898 generic.go:334] "Generic (PLEG): container finished" podID="587f3b75-4378-4c14-a2e0-e990a8270221" containerID="990a62f9ff82ff3530c329b636c3527df7a6c684d6a51abdb5a33f86435d293b" exitCode=0 Dec 11 14:01:04 crc kubenswrapper[4898]: I1211 14:01:04.335574 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29424361-bt5ps" event={"ID":"587f3b75-4378-4c14-a2e0-e990a8270221","Type":"ContainerDied","Data":"990a62f9ff82ff3530c329b636c3527df7a6c684d6a51abdb5a33f86435d293b"} Dec 11 14:01:05 crc kubenswrapper[4898]: I1211 14:01:05.800036 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:05 crc kubenswrapper[4898]: I1211 14:01:05.852184 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-config-data\") pod \"587f3b75-4378-4c14-a2e0-e990a8270221\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " Dec 11 14:01:05 crc kubenswrapper[4898]: I1211 14:01:05.852306 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9fbb\" (UniqueName: \"kubernetes.io/projected/587f3b75-4378-4c14-a2e0-e990a8270221-kube-api-access-r9fbb\") pod \"587f3b75-4378-4c14-a2e0-e990a8270221\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " Dec 11 14:01:05 crc kubenswrapper[4898]: I1211 14:01:05.852411 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-fernet-keys\") pod \"587f3b75-4378-4c14-a2e0-e990a8270221\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " Dec 11 14:01:05 crc kubenswrapper[4898]: I1211 14:01:05.852450 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-combined-ca-bundle\") pod \"587f3b75-4378-4c14-a2e0-e990a8270221\" (UID: \"587f3b75-4378-4c14-a2e0-e990a8270221\") " Dec 11 14:01:05 crc kubenswrapper[4898]: I1211 14:01:05.860503 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "587f3b75-4378-4c14-a2e0-e990a8270221" (UID: "587f3b75-4378-4c14-a2e0-e990a8270221"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 14:01:05 crc kubenswrapper[4898]: I1211 14:01:05.869675 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/587f3b75-4378-4c14-a2e0-e990a8270221-kube-api-access-r9fbb" (OuterVolumeSpecName: "kube-api-access-r9fbb") pod "587f3b75-4378-4c14-a2e0-e990a8270221" (UID: "587f3b75-4378-4c14-a2e0-e990a8270221"). InnerVolumeSpecName "kube-api-access-r9fbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:01:05 crc kubenswrapper[4898]: I1211 14:01:05.887276 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "587f3b75-4378-4c14-a2e0-e990a8270221" (UID: "587f3b75-4378-4c14-a2e0-e990a8270221"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 14:01:05 crc kubenswrapper[4898]: I1211 14:01:05.928324 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-config-data" (OuterVolumeSpecName: "config-data") pod "587f3b75-4378-4c14-a2e0-e990a8270221" (UID: "587f3b75-4378-4c14-a2e0-e990a8270221"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 14:01:05 crc kubenswrapper[4898]: I1211 14:01:05.954578 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 14:01:05 crc kubenswrapper[4898]: I1211 14:01:05.954622 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9fbb\" (UniqueName: \"kubernetes.io/projected/587f3b75-4378-4c14-a2e0-e990a8270221-kube-api-access-r9fbb\") on node \"crc\" DevicePath \"\"" Dec 11 14:01:05 crc kubenswrapper[4898]: I1211 14:01:05.954635 4898 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 11 14:01:05 crc kubenswrapper[4898]: I1211 14:01:05.954646 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/587f3b75-4378-4c14-a2e0-e990a8270221-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 14:01:06 crc kubenswrapper[4898]: I1211 14:01:06.363230 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29424361-bt5ps" event={"ID":"587f3b75-4378-4c14-a2e0-e990a8270221","Type":"ContainerDied","Data":"ffb820e9a3a61a9c373af791433a8042bfc4db65fcf3e0157931f0a3878c3a42"} Dec 11 14:01:06 crc kubenswrapper[4898]: I1211 14:01:06.363314 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffb820e9a3a61a9c373af791433a8042bfc4db65fcf3e0157931f0a3878c3a42" Dec 11 14:01:06 crc kubenswrapper[4898]: I1211 14:01:06.363385 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29424361-bt5ps" Dec 11 14:01:08 crc kubenswrapper[4898]: I1211 14:01:08.774738 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 14:01:09 crc kubenswrapper[4898]: I1211 14:01:09.395567 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"7c0859071c39fa4017006af9ae39872de783996a7610e283d386c593f08b4099"} Dec 11 14:02:36 crc kubenswrapper[4898]: I1211 14:02:36.894916 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-snwg8"] Dec 11 14:02:36 crc kubenswrapper[4898]: E1211 14:02:36.896179 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="587f3b75-4378-4c14-a2e0-e990a8270221" containerName="keystone-cron" Dec 11 14:02:36 crc kubenswrapper[4898]: I1211 14:02:36.896198 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="587f3b75-4378-4c14-a2e0-e990a8270221" containerName="keystone-cron" Dec 11 14:02:36 crc kubenswrapper[4898]: I1211 14:02:36.896568 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="587f3b75-4378-4c14-a2e0-e990a8270221" containerName="keystone-cron" Dec 11 14:02:36 crc kubenswrapper[4898]: I1211 14:02:36.900720 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:36 crc kubenswrapper[4898]: I1211 14:02:36.936112 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-snwg8"] Dec 11 14:02:37 crc kubenswrapper[4898]: I1211 14:02:37.007778 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpjxl\" (UniqueName: \"kubernetes.io/projected/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-kube-api-access-zpjxl\") pod \"redhat-operators-snwg8\" (UID: \"d3668c73-ebe4-470c-a5bf-bae1f1dbf067\") " pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:37 crc kubenswrapper[4898]: I1211 14:02:37.007841 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-catalog-content\") pod \"redhat-operators-snwg8\" (UID: \"d3668c73-ebe4-470c-a5bf-bae1f1dbf067\") " pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:37 crc kubenswrapper[4898]: I1211 14:02:37.008032 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-utilities\") pod \"redhat-operators-snwg8\" (UID: \"d3668c73-ebe4-470c-a5bf-bae1f1dbf067\") " pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:37 crc kubenswrapper[4898]: I1211 14:02:37.110309 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpjxl\" (UniqueName: \"kubernetes.io/projected/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-kube-api-access-zpjxl\") pod \"redhat-operators-snwg8\" (UID: \"d3668c73-ebe4-470c-a5bf-bae1f1dbf067\") " pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:37 crc kubenswrapper[4898]: I1211 14:02:37.110382 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-catalog-content\") pod \"redhat-operators-snwg8\" (UID: \"d3668c73-ebe4-470c-a5bf-bae1f1dbf067\") " pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:37 crc kubenswrapper[4898]: I1211 14:02:37.110535 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-utilities\") pod \"redhat-operators-snwg8\" (UID: \"d3668c73-ebe4-470c-a5bf-bae1f1dbf067\") " pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:37 crc kubenswrapper[4898]: I1211 14:02:37.111262 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-utilities\") pod \"redhat-operators-snwg8\" (UID: \"d3668c73-ebe4-470c-a5bf-bae1f1dbf067\") " pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:37 crc kubenswrapper[4898]: I1211 14:02:37.111890 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-catalog-content\") pod \"redhat-operators-snwg8\" (UID: \"d3668c73-ebe4-470c-a5bf-bae1f1dbf067\") " pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:37 crc kubenswrapper[4898]: I1211 14:02:37.144126 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpjxl\" (UniqueName: \"kubernetes.io/projected/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-kube-api-access-zpjxl\") pod \"redhat-operators-snwg8\" (UID: \"d3668c73-ebe4-470c-a5bf-bae1f1dbf067\") " pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:37 crc kubenswrapper[4898]: I1211 14:02:37.229373 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:37 crc kubenswrapper[4898]: I1211 14:02:37.744190 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-snwg8"] Dec 11 14:02:38 crc kubenswrapper[4898]: I1211 14:02:38.518610 4898 generic.go:334] "Generic (PLEG): container finished" podID="d3668c73-ebe4-470c-a5bf-bae1f1dbf067" containerID="c085bdaa75c18a6be2f7d6ee35a4e2164cf9052c4b238a75a41a9ae5c0bc3266" exitCode=0 Dec 11 14:02:38 crc kubenswrapper[4898]: I1211 14:02:38.518678 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snwg8" event={"ID":"d3668c73-ebe4-470c-a5bf-bae1f1dbf067","Type":"ContainerDied","Data":"c085bdaa75c18a6be2f7d6ee35a4e2164cf9052c4b238a75a41a9ae5c0bc3266"} Dec 11 14:02:38 crc kubenswrapper[4898]: I1211 14:02:38.519128 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snwg8" event={"ID":"d3668c73-ebe4-470c-a5bf-bae1f1dbf067","Type":"ContainerStarted","Data":"15e1a2f2224323dfef59d6f0b0d58882e8717fbf383387acd9b498546b65f0c4"} Dec 11 14:02:39 crc kubenswrapper[4898]: I1211 14:02:39.535262 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snwg8" event={"ID":"d3668c73-ebe4-470c-a5bf-bae1f1dbf067","Type":"ContainerStarted","Data":"24880cb642ae5f5f0cb8db0e85cbd309edd293d73797399a9a991b0bc542b4b3"} Dec 11 14:02:44 crc kubenswrapper[4898]: I1211 14:02:44.593758 4898 generic.go:334] "Generic (PLEG): container finished" podID="d3668c73-ebe4-470c-a5bf-bae1f1dbf067" containerID="24880cb642ae5f5f0cb8db0e85cbd309edd293d73797399a9a991b0bc542b4b3" exitCode=0 Dec 11 14:02:44 crc kubenswrapper[4898]: I1211 14:02:44.593835 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snwg8" 
event={"ID":"d3668c73-ebe4-470c-a5bf-bae1f1dbf067","Type":"ContainerDied","Data":"24880cb642ae5f5f0cb8db0e85cbd309edd293d73797399a9a991b0bc542b4b3"} Dec 11 14:02:46 crc kubenswrapper[4898]: I1211 14:02:46.630905 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snwg8" event={"ID":"d3668c73-ebe4-470c-a5bf-bae1f1dbf067","Type":"ContainerStarted","Data":"85057ded354dbec746de981673754c2a90041360dd2ce1e1a40b7a2317a25490"} Dec 11 14:02:46 crc kubenswrapper[4898]: I1211 14:02:46.662418 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-snwg8" podStartSLOduration=3.662233831 podStartE2EDuration="10.662399002s" podCreationTimestamp="2025-12-11 14:02:36 +0000 UTC" firstStartedPulling="2025-12-11 14:02:38.520844556 +0000 UTC m=+3516.093170993" lastFinishedPulling="2025-12-11 14:02:45.521009727 +0000 UTC m=+3523.093336164" observedRunningTime="2025-12-11 14:02:46.652438613 +0000 UTC m=+3524.224765050" watchObservedRunningTime="2025-12-11 14:02:46.662399002 +0000 UTC m=+3524.234725439" Dec 11 14:02:47 crc kubenswrapper[4898]: I1211 14:02:47.229664 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:47 crc kubenswrapper[4898]: I1211 14:02:47.229743 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:48 crc kubenswrapper[4898]: I1211 14:02:48.302005 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-snwg8" podUID="d3668c73-ebe4-470c-a5bf-bae1f1dbf067" containerName="registry-server" probeResult="failure" output=< Dec 11 14:02:48 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:02:48 crc kubenswrapper[4898]: > Dec 11 14:02:57 crc kubenswrapper[4898]: I1211 14:02:57.279310 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:57 crc kubenswrapper[4898]: I1211 14:02:57.342616 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:57 crc kubenswrapper[4898]: I1211 14:02:57.516994 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-snwg8"] Dec 11 14:02:58 crc kubenswrapper[4898]: I1211 14:02:58.764405 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-snwg8" podUID="d3668c73-ebe4-470c-a5bf-bae1f1dbf067" containerName="registry-server" containerID="cri-o://85057ded354dbec746de981673754c2a90041360dd2ce1e1a40b7a2317a25490" gracePeriod=2 Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.380350 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.535168 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpjxl\" (UniqueName: \"kubernetes.io/projected/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-kube-api-access-zpjxl\") pod \"d3668c73-ebe4-470c-a5bf-bae1f1dbf067\" (UID: \"d3668c73-ebe4-470c-a5bf-bae1f1dbf067\") " Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.535340 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-utilities\") pod \"d3668c73-ebe4-470c-a5bf-bae1f1dbf067\" (UID: \"d3668c73-ebe4-470c-a5bf-bae1f1dbf067\") " Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.535487 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-catalog-content\") pod 
\"d3668c73-ebe4-470c-a5bf-bae1f1dbf067\" (UID: \"d3668c73-ebe4-470c-a5bf-bae1f1dbf067\") " Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.536351 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-utilities" (OuterVolumeSpecName: "utilities") pod "d3668c73-ebe4-470c-a5bf-bae1f1dbf067" (UID: "d3668c73-ebe4-470c-a5bf-bae1f1dbf067"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.542572 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-kube-api-access-zpjxl" (OuterVolumeSpecName: "kube-api-access-zpjxl") pod "d3668c73-ebe4-470c-a5bf-bae1f1dbf067" (UID: "d3668c73-ebe4-470c-a5bf-bae1f1dbf067"). InnerVolumeSpecName "kube-api-access-zpjxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.639084 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpjxl\" (UniqueName: \"kubernetes.io/projected/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-kube-api-access-zpjxl\") on node \"crc\" DevicePath \"\"" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.640618 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.649757 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3668c73-ebe4-470c-a5bf-bae1f1dbf067" (UID: "d3668c73-ebe4-470c-a5bf-bae1f1dbf067"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.743284 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3668c73-ebe4-470c-a5bf-bae1f1dbf067-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.776702 4898 generic.go:334] "Generic (PLEG): container finished" podID="d3668c73-ebe4-470c-a5bf-bae1f1dbf067" containerID="85057ded354dbec746de981673754c2a90041360dd2ce1e1a40b7a2317a25490" exitCode=0 Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.776750 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snwg8" event={"ID":"d3668c73-ebe4-470c-a5bf-bae1f1dbf067","Type":"ContainerDied","Data":"85057ded354dbec746de981673754c2a90041360dd2ce1e1a40b7a2317a25490"} Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.776779 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snwg8" event={"ID":"d3668c73-ebe4-470c-a5bf-bae1f1dbf067","Type":"ContainerDied","Data":"15e1a2f2224323dfef59d6f0b0d58882e8717fbf383387acd9b498546b65f0c4"} Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.776799 4898 scope.go:117] "RemoveContainer" containerID="85057ded354dbec746de981673754c2a90041360dd2ce1e1a40b7a2317a25490" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.776981 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-snwg8" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.810127 4898 scope.go:117] "RemoveContainer" containerID="24880cb642ae5f5f0cb8db0e85cbd309edd293d73797399a9a991b0bc542b4b3" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.812050 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-snwg8"] Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.822966 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-snwg8"] Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.863274 4898 scope.go:117] "RemoveContainer" containerID="c085bdaa75c18a6be2f7d6ee35a4e2164cf9052c4b238a75a41a9ae5c0bc3266" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.913034 4898 scope.go:117] "RemoveContainer" containerID="85057ded354dbec746de981673754c2a90041360dd2ce1e1a40b7a2317a25490" Dec 11 14:02:59 crc kubenswrapper[4898]: E1211 14:02:59.913410 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85057ded354dbec746de981673754c2a90041360dd2ce1e1a40b7a2317a25490\": container with ID starting with 85057ded354dbec746de981673754c2a90041360dd2ce1e1a40b7a2317a25490 not found: ID does not exist" containerID="85057ded354dbec746de981673754c2a90041360dd2ce1e1a40b7a2317a25490" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.913499 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85057ded354dbec746de981673754c2a90041360dd2ce1e1a40b7a2317a25490"} err="failed to get container status \"85057ded354dbec746de981673754c2a90041360dd2ce1e1a40b7a2317a25490\": rpc error: code = NotFound desc = could not find container \"85057ded354dbec746de981673754c2a90041360dd2ce1e1a40b7a2317a25490\": container with ID starting with 85057ded354dbec746de981673754c2a90041360dd2ce1e1a40b7a2317a25490 not found: ID does 
not exist" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.913531 4898 scope.go:117] "RemoveContainer" containerID="24880cb642ae5f5f0cb8db0e85cbd309edd293d73797399a9a991b0bc542b4b3" Dec 11 14:02:59 crc kubenswrapper[4898]: E1211 14:02:59.914033 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24880cb642ae5f5f0cb8db0e85cbd309edd293d73797399a9a991b0bc542b4b3\": container with ID starting with 24880cb642ae5f5f0cb8db0e85cbd309edd293d73797399a9a991b0bc542b4b3 not found: ID does not exist" containerID="24880cb642ae5f5f0cb8db0e85cbd309edd293d73797399a9a991b0bc542b4b3" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.914064 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24880cb642ae5f5f0cb8db0e85cbd309edd293d73797399a9a991b0bc542b4b3"} err="failed to get container status \"24880cb642ae5f5f0cb8db0e85cbd309edd293d73797399a9a991b0bc542b4b3\": rpc error: code = NotFound desc = could not find container \"24880cb642ae5f5f0cb8db0e85cbd309edd293d73797399a9a991b0bc542b4b3\": container with ID starting with 24880cb642ae5f5f0cb8db0e85cbd309edd293d73797399a9a991b0bc542b4b3 not found: ID does not exist" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.914087 4898 scope.go:117] "RemoveContainer" containerID="c085bdaa75c18a6be2f7d6ee35a4e2164cf9052c4b238a75a41a9ae5c0bc3266" Dec 11 14:02:59 crc kubenswrapper[4898]: E1211 14:02:59.914298 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c085bdaa75c18a6be2f7d6ee35a4e2164cf9052c4b238a75a41a9ae5c0bc3266\": container with ID starting with c085bdaa75c18a6be2f7d6ee35a4e2164cf9052c4b238a75a41a9ae5c0bc3266 not found: ID does not exist" containerID="c085bdaa75c18a6be2f7d6ee35a4e2164cf9052c4b238a75a41a9ae5c0bc3266" Dec 11 14:02:59 crc kubenswrapper[4898]: I1211 14:02:59.914333 4898 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c085bdaa75c18a6be2f7d6ee35a4e2164cf9052c4b238a75a41a9ae5c0bc3266"} err="failed to get container status \"c085bdaa75c18a6be2f7d6ee35a4e2164cf9052c4b238a75a41a9ae5c0bc3266\": rpc error: code = NotFound desc = could not find container \"c085bdaa75c18a6be2f7d6ee35a4e2164cf9052c4b238a75a41a9ae5c0bc3266\": container with ID starting with c085bdaa75c18a6be2f7d6ee35a4e2164cf9052c4b238a75a41a9ae5c0bc3266 not found: ID does not exist" Dec 11 14:03:00 crc kubenswrapper[4898]: I1211 14:03:00.794757 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3668c73-ebe4-470c-a5bf-bae1f1dbf067" path="/var/lib/kubelet/pods/d3668c73-ebe4-470c-a5bf-bae1f1dbf067/volumes" Dec 11 14:03:34 crc kubenswrapper[4898]: I1211 14:03:34.996567 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:03:34 crc kubenswrapper[4898]: I1211 14:03:34.997540 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:04:04 crc kubenswrapper[4898]: I1211 14:04:04.996247 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:04:04 crc kubenswrapper[4898]: I1211 14:04:04.996812 4898 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:04:34 crc kubenswrapper[4898]: I1211 14:04:34.996045 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:04:34 crc kubenswrapper[4898]: I1211 14:04:34.997484 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:04:34 crc kubenswrapper[4898]: I1211 14:04:34.997629 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 14:04:34 crc kubenswrapper[4898]: I1211 14:04:34.998610 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7c0859071c39fa4017006af9ae39872de783996a7610e283d386c593f08b4099"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 14:04:34 crc kubenswrapper[4898]: I1211 14:04:34.998686 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" 
containerID="cri-o://7c0859071c39fa4017006af9ae39872de783996a7610e283d386c593f08b4099" gracePeriod=600 Dec 11 14:04:35 crc kubenswrapper[4898]: I1211 14:04:35.895837 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="7c0859071c39fa4017006af9ae39872de783996a7610e283d386c593f08b4099" exitCode=0 Dec 11 14:04:35 crc kubenswrapper[4898]: I1211 14:04:35.895874 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"7c0859071c39fa4017006af9ae39872de783996a7610e283d386c593f08b4099"} Dec 11 14:04:35 crc kubenswrapper[4898]: I1211 14:04:35.896093 4898 scope.go:117] "RemoveContainer" containerID="c9e922d5ee469f620afdfb2ddd69dbee93879150084a4966647adbf551d5f7e9" Dec 11 14:04:36 crc kubenswrapper[4898]: I1211 14:04:36.909774 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3"} Dec 11 14:06:23 crc kubenswrapper[4898]: I1211 14:06:23.378336 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n9s2t"] Dec 11 14:06:23 crc kubenswrapper[4898]: E1211 14:06:23.379691 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3668c73-ebe4-470c-a5bf-bae1f1dbf067" containerName="registry-server" Dec 11 14:06:23 crc kubenswrapper[4898]: I1211 14:06:23.379714 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3668c73-ebe4-470c-a5bf-bae1f1dbf067" containerName="registry-server" Dec 11 14:06:23 crc kubenswrapper[4898]: E1211 14:06:23.379763 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3668c73-ebe4-470c-a5bf-bae1f1dbf067" containerName="extract-content" Dec 11 14:06:23 
crc kubenswrapper[4898]: I1211 14:06:23.379772 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3668c73-ebe4-470c-a5bf-bae1f1dbf067" containerName="extract-content" Dec 11 14:06:23 crc kubenswrapper[4898]: E1211 14:06:23.379784 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3668c73-ebe4-470c-a5bf-bae1f1dbf067" containerName="extract-utilities" Dec 11 14:06:23 crc kubenswrapper[4898]: I1211 14:06:23.379792 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3668c73-ebe4-470c-a5bf-bae1f1dbf067" containerName="extract-utilities" Dec 11 14:06:23 crc kubenswrapper[4898]: I1211 14:06:23.380066 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3668c73-ebe4-470c-a5bf-bae1f1dbf067" containerName="registry-server" Dec 11 14:06:23 crc kubenswrapper[4898]: I1211 14:06:23.382167 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:23 crc kubenswrapper[4898]: I1211 14:06:23.400531 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n9s2t"] Dec 11 14:06:23 crc kubenswrapper[4898]: I1211 14:06:23.529976 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2cffdf2-e6c5-488a-a407-41548b22716d-utilities\") pod \"community-operators-n9s2t\" (UID: \"b2cffdf2-e6c5-488a-a407-41548b22716d\") " pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:23 crc kubenswrapper[4898]: I1211 14:06:23.530520 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpz4s\" (UniqueName: \"kubernetes.io/projected/b2cffdf2-e6c5-488a-a407-41548b22716d-kube-api-access-hpz4s\") pod \"community-operators-n9s2t\" (UID: \"b2cffdf2-e6c5-488a-a407-41548b22716d\") " pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:23 crc 
kubenswrapper[4898]: I1211 14:06:23.530563 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2cffdf2-e6c5-488a-a407-41548b22716d-catalog-content\") pod \"community-operators-n9s2t\" (UID: \"b2cffdf2-e6c5-488a-a407-41548b22716d\") " pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:23 crc kubenswrapper[4898]: I1211 14:06:23.632790 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpz4s\" (UniqueName: \"kubernetes.io/projected/b2cffdf2-e6c5-488a-a407-41548b22716d-kube-api-access-hpz4s\") pod \"community-operators-n9s2t\" (UID: \"b2cffdf2-e6c5-488a-a407-41548b22716d\") " pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:23 crc kubenswrapper[4898]: I1211 14:06:23.632840 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2cffdf2-e6c5-488a-a407-41548b22716d-catalog-content\") pod \"community-operators-n9s2t\" (UID: \"b2cffdf2-e6c5-488a-a407-41548b22716d\") " pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:23 crc kubenswrapper[4898]: I1211 14:06:23.632931 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2cffdf2-e6c5-488a-a407-41548b22716d-utilities\") pod \"community-operators-n9s2t\" (UID: \"b2cffdf2-e6c5-488a-a407-41548b22716d\") " pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:23 crc kubenswrapper[4898]: I1211 14:06:23.633422 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2cffdf2-e6c5-488a-a407-41548b22716d-utilities\") pod \"community-operators-n9s2t\" (UID: \"b2cffdf2-e6c5-488a-a407-41548b22716d\") " pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:23 crc 
kubenswrapper[4898]: I1211 14:06:23.633525 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2cffdf2-e6c5-488a-a407-41548b22716d-catalog-content\") pod \"community-operators-n9s2t\" (UID: \"b2cffdf2-e6c5-488a-a407-41548b22716d\") " pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:23 crc kubenswrapper[4898]: I1211 14:06:23.655714 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpz4s\" (UniqueName: \"kubernetes.io/projected/b2cffdf2-e6c5-488a-a407-41548b22716d-kube-api-access-hpz4s\") pod \"community-operators-n9s2t\" (UID: \"b2cffdf2-e6c5-488a-a407-41548b22716d\") " pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:23 crc kubenswrapper[4898]: I1211 14:06:23.712222 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:24 crc kubenswrapper[4898]: I1211 14:06:24.361176 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n9s2t"] Dec 11 14:06:25 crc kubenswrapper[4898]: I1211 14:06:25.127822 4898 generic.go:334] "Generic (PLEG): container finished" podID="b2cffdf2-e6c5-488a-a407-41548b22716d" containerID="cb4196af8592504614bffce079d4b911ac9d4d6a5caf275e244eed2a72977b07" exitCode=0 Dec 11 14:06:25 crc kubenswrapper[4898]: I1211 14:06:25.127907 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9s2t" event={"ID":"b2cffdf2-e6c5-488a-a407-41548b22716d","Type":"ContainerDied","Data":"cb4196af8592504614bffce079d4b911ac9d4d6a5caf275e244eed2a72977b07"} Dec 11 14:06:25 crc kubenswrapper[4898]: I1211 14:06:25.128175 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9s2t" 
event={"ID":"b2cffdf2-e6c5-488a-a407-41548b22716d","Type":"ContainerStarted","Data":"db879c7a545a6560e6b98f2edd246fbef03e73443b56584f9ab0b55967dc319b"} Dec 11 14:06:25 crc kubenswrapper[4898]: I1211 14:06:25.131947 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 14:06:28 crc kubenswrapper[4898]: I1211 14:06:28.167648 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9s2t" event={"ID":"b2cffdf2-e6c5-488a-a407-41548b22716d","Type":"ContainerStarted","Data":"5e8005d4104b7ea50a7cc1faf0be196042d095bc66a715059e1b2aa9bf4b6ac6"} Dec 11 14:06:29 crc kubenswrapper[4898]: I1211 14:06:29.180349 4898 generic.go:334] "Generic (PLEG): container finished" podID="b2cffdf2-e6c5-488a-a407-41548b22716d" containerID="5e8005d4104b7ea50a7cc1faf0be196042d095bc66a715059e1b2aa9bf4b6ac6" exitCode=0 Dec 11 14:06:29 crc kubenswrapper[4898]: I1211 14:06:29.180485 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9s2t" event={"ID":"b2cffdf2-e6c5-488a-a407-41548b22716d","Type":"ContainerDied","Data":"5e8005d4104b7ea50a7cc1faf0be196042d095bc66a715059e1b2aa9bf4b6ac6"} Dec 11 14:06:30 crc kubenswrapper[4898]: I1211 14:06:30.193388 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9s2t" event={"ID":"b2cffdf2-e6c5-488a-a407-41548b22716d","Type":"ContainerStarted","Data":"35aea468155b3386e6e528b77fc5bded3547a82d8dc151c26dc67e49df19e677"} Dec 11 14:06:30 crc kubenswrapper[4898]: I1211 14:06:30.218721 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n9s2t" podStartSLOduration=2.507735082 podStartE2EDuration="7.218697601s" podCreationTimestamp="2025-12-11 14:06:23 +0000 UTC" firstStartedPulling="2025-12-11 14:06:25.131676199 +0000 UTC m=+3742.704002636" lastFinishedPulling="2025-12-11 14:06:29.842638728 +0000 UTC 
m=+3747.414965155" observedRunningTime="2025-12-11 14:06:30.209943645 +0000 UTC m=+3747.782270082" watchObservedRunningTime="2025-12-11 14:06:30.218697601 +0000 UTC m=+3747.791024038" Dec 11 14:06:33 crc kubenswrapper[4898]: I1211 14:06:33.712387 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:33 crc kubenswrapper[4898]: I1211 14:06:33.712971 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:34 crc kubenswrapper[4898]: I1211 14:06:34.760508 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-n9s2t" podUID="b2cffdf2-e6c5-488a-a407-41548b22716d" containerName="registry-server" probeResult="failure" output=< Dec 11 14:06:34 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:06:34 crc kubenswrapper[4898]: > Dec 11 14:06:43 crc kubenswrapper[4898]: I1211 14:06:43.761404 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:43 crc kubenswrapper[4898]: I1211 14:06:43.811961 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:44 crc kubenswrapper[4898]: I1211 14:06:44.006260 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n9s2t"] Dec 11 14:06:45 crc kubenswrapper[4898]: I1211 14:06:45.337888 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n9s2t" podUID="b2cffdf2-e6c5-488a-a407-41548b22716d" containerName="registry-server" containerID="cri-o://35aea468155b3386e6e528b77fc5bded3547a82d8dc151c26dc67e49df19e677" gracePeriod=2 Dec 11 14:06:45 crc kubenswrapper[4898]: I1211 14:06:45.896420 4898 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:45.996373 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2cffdf2-e6c5-488a-a407-41548b22716d-utilities\") pod \"b2cffdf2-e6c5-488a-a407-41548b22716d\" (UID: \"b2cffdf2-e6c5-488a-a407-41548b22716d\") " Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:45.996569 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2cffdf2-e6c5-488a-a407-41548b22716d-catalog-content\") pod \"b2cffdf2-e6c5-488a-a407-41548b22716d\" (UID: \"b2cffdf2-e6c5-488a-a407-41548b22716d\") " Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:45.996754 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpz4s\" (UniqueName: \"kubernetes.io/projected/b2cffdf2-e6c5-488a-a407-41548b22716d-kube-api-access-hpz4s\") pod \"b2cffdf2-e6c5-488a-a407-41548b22716d\" (UID: \"b2cffdf2-e6c5-488a-a407-41548b22716d\") " Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.003544 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2cffdf2-e6c5-488a-a407-41548b22716d-utilities" (OuterVolumeSpecName: "utilities") pod "b2cffdf2-e6c5-488a-a407-41548b22716d" (UID: "b2cffdf2-e6c5-488a-a407-41548b22716d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.023418 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2cffdf2-e6c5-488a-a407-41548b22716d-kube-api-access-hpz4s" (OuterVolumeSpecName: "kube-api-access-hpz4s") pod "b2cffdf2-e6c5-488a-a407-41548b22716d" (UID: "b2cffdf2-e6c5-488a-a407-41548b22716d"). 
InnerVolumeSpecName "kube-api-access-hpz4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.080836 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2cffdf2-e6c5-488a-a407-41548b22716d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2cffdf2-e6c5-488a-a407-41548b22716d" (UID: "b2cffdf2-e6c5-488a-a407-41548b22716d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.100318 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2cffdf2-e6c5-488a-a407-41548b22716d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.100356 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpz4s\" (UniqueName: \"kubernetes.io/projected/b2cffdf2-e6c5-488a-a407-41548b22716d-kube-api-access-hpz4s\") on node \"crc\" DevicePath \"\"" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.100370 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2cffdf2-e6c5-488a-a407-41548b22716d-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.352300 4898 generic.go:334] "Generic (PLEG): container finished" podID="b2cffdf2-e6c5-488a-a407-41548b22716d" containerID="35aea468155b3386e6e528b77fc5bded3547a82d8dc151c26dc67e49df19e677" exitCode=0 Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.352397 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n9s2t" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.352405 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9s2t" event={"ID":"b2cffdf2-e6c5-488a-a407-41548b22716d","Type":"ContainerDied","Data":"35aea468155b3386e6e528b77fc5bded3547a82d8dc151c26dc67e49df19e677"} Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.352875 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9s2t" event={"ID":"b2cffdf2-e6c5-488a-a407-41548b22716d","Type":"ContainerDied","Data":"db879c7a545a6560e6b98f2edd246fbef03e73443b56584f9ab0b55967dc319b"} Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.352901 4898 scope.go:117] "RemoveContainer" containerID="35aea468155b3386e6e528b77fc5bded3547a82d8dc151c26dc67e49df19e677" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.395329 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n9s2t"] Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.401392 4898 scope.go:117] "RemoveContainer" containerID="5e8005d4104b7ea50a7cc1faf0be196042d095bc66a715059e1b2aa9bf4b6ac6" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.406565 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n9s2t"] Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.431787 4898 scope.go:117] "RemoveContainer" containerID="cb4196af8592504614bffce079d4b911ac9d4d6a5caf275e244eed2a72977b07" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.494132 4898 scope.go:117] "RemoveContainer" containerID="35aea468155b3386e6e528b77fc5bded3547a82d8dc151c26dc67e49df19e677" Dec 11 14:06:46 crc kubenswrapper[4898]: E1211 14:06:46.494680 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"35aea468155b3386e6e528b77fc5bded3547a82d8dc151c26dc67e49df19e677\": container with ID starting with 35aea468155b3386e6e528b77fc5bded3547a82d8dc151c26dc67e49df19e677 not found: ID does not exist" containerID="35aea468155b3386e6e528b77fc5bded3547a82d8dc151c26dc67e49df19e677" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.494736 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35aea468155b3386e6e528b77fc5bded3547a82d8dc151c26dc67e49df19e677"} err="failed to get container status \"35aea468155b3386e6e528b77fc5bded3547a82d8dc151c26dc67e49df19e677\": rpc error: code = NotFound desc = could not find container \"35aea468155b3386e6e528b77fc5bded3547a82d8dc151c26dc67e49df19e677\": container with ID starting with 35aea468155b3386e6e528b77fc5bded3547a82d8dc151c26dc67e49df19e677 not found: ID does not exist" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.494773 4898 scope.go:117] "RemoveContainer" containerID="5e8005d4104b7ea50a7cc1faf0be196042d095bc66a715059e1b2aa9bf4b6ac6" Dec 11 14:06:46 crc kubenswrapper[4898]: E1211 14:06:46.495055 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e8005d4104b7ea50a7cc1faf0be196042d095bc66a715059e1b2aa9bf4b6ac6\": container with ID starting with 5e8005d4104b7ea50a7cc1faf0be196042d095bc66a715059e1b2aa9bf4b6ac6 not found: ID does not exist" containerID="5e8005d4104b7ea50a7cc1faf0be196042d095bc66a715059e1b2aa9bf4b6ac6" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.495090 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e8005d4104b7ea50a7cc1faf0be196042d095bc66a715059e1b2aa9bf4b6ac6"} err="failed to get container status \"5e8005d4104b7ea50a7cc1faf0be196042d095bc66a715059e1b2aa9bf4b6ac6\": rpc error: code = NotFound desc = could not find container \"5e8005d4104b7ea50a7cc1faf0be196042d095bc66a715059e1b2aa9bf4b6ac6\": container with ID 
starting with 5e8005d4104b7ea50a7cc1faf0be196042d095bc66a715059e1b2aa9bf4b6ac6 not found: ID does not exist" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.495109 4898 scope.go:117] "RemoveContainer" containerID="cb4196af8592504614bffce079d4b911ac9d4d6a5caf275e244eed2a72977b07" Dec 11 14:06:46 crc kubenswrapper[4898]: E1211 14:06:46.495381 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb4196af8592504614bffce079d4b911ac9d4d6a5caf275e244eed2a72977b07\": container with ID starting with cb4196af8592504614bffce079d4b911ac9d4d6a5caf275e244eed2a72977b07 not found: ID does not exist" containerID="cb4196af8592504614bffce079d4b911ac9d4d6a5caf275e244eed2a72977b07" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.495428 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb4196af8592504614bffce079d4b911ac9d4d6a5caf275e244eed2a72977b07"} err="failed to get container status \"cb4196af8592504614bffce079d4b911ac9d4d6a5caf275e244eed2a72977b07\": rpc error: code = NotFound desc = could not find container \"cb4196af8592504614bffce079d4b911ac9d4d6a5caf275e244eed2a72977b07\": container with ID starting with cb4196af8592504614bffce079d4b911ac9d4d6a5caf275e244eed2a72977b07 not found: ID does not exist" Dec 11 14:06:46 crc kubenswrapper[4898]: I1211 14:06:46.791179 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2cffdf2-e6c5-488a-a407-41548b22716d" path="/var/lib/kubelet/pods/b2cffdf2-e6c5-488a-a407-41548b22716d/volumes" Dec 11 14:06:58 crc kubenswrapper[4898]: I1211 14:06:58.959124 4898 patch_prober.go:28] interesting pod/nmstate-webhook-f8fb84555-z6l22 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 
14:06:58 crc kubenswrapper[4898]: I1211 14:06:58.959784 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" podUID="79dd8f49-7447-49a9-84a3-252ac5286cc3" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.90:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:07:04 crc kubenswrapper[4898]: I1211 14:07:04.996119 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:07:04 crc kubenswrapper[4898]: I1211 14:07:04.996683 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:07:34 crc kubenswrapper[4898]: I1211 14:07:34.995860 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:07:34 crc kubenswrapper[4898]: I1211 14:07:34.996662 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:08:04 crc kubenswrapper[4898]: I1211 14:08:04.995898 4898 
patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:08:04 crc kubenswrapper[4898]: I1211 14:08:04.996939 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:08:04 crc kubenswrapper[4898]: I1211 14:08:04.997047 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 14:08:04 crc kubenswrapper[4898]: I1211 14:08:04.999240 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 14:08:04 crc kubenswrapper[4898]: I1211 14:08:04.999306 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" gracePeriod=600 Dec 11 14:08:05 crc kubenswrapper[4898]: E1211 14:08:05.625497 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:08:06 crc kubenswrapper[4898]: I1211 14:08:06.255571 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3"} Dec 11 14:08:06 crc kubenswrapper[4898]: I1211 14:08:06.255585 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" exitCode=0 Dec 11 14:08:06 crc kubenswrapper[4898]: I1211 14:08:06.255824 4898 scope.go:117] "RemoveContainer" containerID="7c0859071c39fa4017006af9ae39872de783996a7610e283d386c593f08b4099" Dec 11 14:08:06 crc kubenswrapper[4898]: I1211 14:08:06.256700 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:08:06 crc kubenswrapper[4898]: E1211 14:08:06.257102 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:08:19 crc kubenswrapper[4898]: I1211 14:08:19.775051 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:08:19 crc kubenswrapper[4898]: E1211 14:08:19.776118 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:08:30 crc kubenswrapper[4898]: I1211 14:08:30.775433 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:08:30 crc kubenswrapper[4898]: E1211 14:08:30.776334 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:08:44 crc kubenswrapper[4898]: I1211 14:08:44.775765 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:08:44 crc kubenswrapper[4898]: E1211 14:08:44.776407 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:08:56 crc kubenswrapper[4898]: I1211 14:08:56.777550 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:08:56 crc kubenswrapper[4898]: E1211 14:08:56.779738 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:09:10 crc kubenswrapper[4898]: I1211 14:09:10.775132 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:09:10 crc kubenswrapper[4898]: E1211 14:09:10.776902 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:09:25 crc kubenswrapper[4898]: I1211 14:09:25.775953 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:09:25 crc kubenswrapper[4898]: E1211 14:09:25.776863 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:09:36 crc kubenswrapper[4898]: I1211 14:09:36.774752 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:09:36 crc kubenswrapper[4898]: E1211 14:09:36.775480 4898 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:09:47 crc kubenswrapper[4898]: I1211 14:09:47.776784 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:09:47 crc kubenswrapper[4898]: E1211 14:09:47.777676 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:10:00 crc kubenswrapper[4898]: I1211 14:10:00.775812 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:10:00 crc kubenswrapper[4898]: E1211 14:10:00.776587 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.165136 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m7bh5"] Dec 11 14:10:03 crc kubenswrapper[4898]: E1211 14:10:03.166309 4898 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b2cffdf2-e6c5-488a-a407-41548b22716d" containerName="extract-utilities" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.166328 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2cffdf2-e6c5-488a-a407-41548b22716d" containerName="extract-utilities" Dec 11 14:10:03 crc kubenswrapper[4898]: E1211 14:10:03.166350 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2cffdf2-e6c5-488a-a407-41548b22716d" containerName="registry-server" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.166357 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2cffdf2-e6c5-488a-a407-41548b22716d" containerName="registry-server" Dec 11 14:10:03 crc kubenswrapper[4898]: E1211 14:10:03.166417 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2cffdf2-e6c5-488a-a407-41548b22716d" containerName="extract-content" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.166426 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2cffdf2-e6c5-488a-a407-41548b22716d" containerName="extract-content" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.166756 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2cffdf2-e6c5-488a-a407-41548b22716d" containerName="registry-server" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.169155 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.179637 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7bh5"] Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.242511 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb40a2c-18df-40db-8593-d3af9054d22a-catalog-content\") pod \"redhat-marketplace-m7bh5\" (UID: \"bcb40a2c-18df-40db-8593-d3af9054d22a\") " pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.242614 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbsj4\" (UniqueName: \"kubernetes.io/projected/bcb40a2c-18df-40db-8593-d3af9054d22a-kube-api-access-xbsj4\") pod \"redhat-marketplace-m7bh5\" (UID: \"bcb40a2c-18df-40db-8593-d3af9054d22a\") " pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.242807 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb40a2c-18df-40db-8593-d3af9054d22a-utilities\") pod \"redhat-marketplace-m7bh5\" (UID: \"bcb40a2c-18df-40db-8593-d3af9054d22a\") " pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.345695 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb40a2c-18df-40db-8593-d3af9054d22a-utilities\") pod \"redhat-marketplace-m7bh5\" (UID: \"bcb40a2c-18df-40db-8593-d3af9054d22a\") " pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.345955 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb40a2c-18df-40db-8593-d3af9054d22a-catalog-content\") pod \"redhat-marketplace-m7bh5\" (UID: \"bcb40a2c-18df-40db-8593-d3af9054d22a\") " pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.346026 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbsj4\" (UniqueName: \"kubernetes.io/projected/bcb40a2c-18df-40db-8593-d3af9054d22a-kube-api-access-xbsj4\") pod \"redhat-marketplace-m7bh5\" (UID: \"bcb40a2c-18df-40db-8593-d3af9054d22a\") " pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.347376 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb40a2c-18df-40db-8593-d3af9054d22a-utilities\") pod \"redhat-marketplace-m7bh5\" (UID: \"bcb40a2c-18df-40db-8593-d3af9054d22a\") " pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.347472 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb40a2c-18df-40db-8593-d3af9054d22a-catalog-content\") pod \"redhat-marketplace-m7bh5\" (UID: \"bcb40a2c-18df-40db-8593-d3af9054d22a\") " pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.372185 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbsj4\" (UniqueName: \"kubernetes.io/projected/bcb40a2c-18df-40db-8593-d3af9054d22a-kube-api-access-xbsj4\") pod \"redhat-marketplace-m7bh5\" (UID: \"bcb40a2c-18df-40db-8593-d3af9054d22a\") " pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:03 crc kubenswrapper[4898]: I1211 14:10:03.499265 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:04 crc kubenswrapper[4898]: I1211 14:10:04.041849 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7bh5"] Dec 11 14:10:04 crc kubenswrapper[4898]: I1211 14:10:04.660895 4898 generic.go:334] "Generic (PLEG): container finished" podID="bcb40a2c-18df-40db-8593-d3af9054d22a" containerID="89bebdae736e79a1bf81d49ba4bbee925007366bec0c1487b5daf2aeeeb7d21b" exitCode=0 Dec 11 14:10:04 crc kubenswrapper[4898]: I1211 14:10:04.660996 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7bh5" event={"ID":"bcb40a2c-18df-40db-8593-d3af9054d22a","Type":"ContainerDied","Data":"89bebdae736e79a1bf81d49ba4bbee925007366bec0c1487b5daf2aeeeb7d21b"} Dec 11 14:10:04 crc kubenswrapper[4898]: I1211 14:10:04.661207 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7bh5" event={"ID":"bcb40a2c-18df-40db-8593-d3af9054d22a","Type":"ContainerStarted","Data":"5e23f35a7c215ff18b6d9bbefb4740b13349f576d65653c2b15c1568ebec3868"} Dec 11 14:10:06 crc kubenswrapper[4898]: I1211 14:10:06.680774 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7bh5" event={"ID":"bcb40a2c-18df-40db-8593-d3af9054d22a","Type":"ContainerStarted","Data":"aad04881e7e42f944f11661e8a2336d3060e6c86e21f1a318d26ace0c98f7667"} Dec 11 14:10:07 crc kubenswrapper[4898]: I1211 14:10:07.366948 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-6867fd7bcf-bbj7b" podUID="4ed17564-edd0-4a66-8b9b-04aabd280113" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 11 14:10:08 crc kubenswrapper[4898]: I1211 14:10:08.703033 4898 generic.go:334] "Generic (PLEG): container finished" podID="bcb40a2c-18df-40db-8593-d3af9054d22a" 
containerID="aad04881e7e42f944f11661e8a2336d3060e6c86e21f1a318d26ace0c98f7667" exitCode=0 Dec 11 14:10:08 crc kubenswrapper[4898]: I1211 14:10:08.703115 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7bh5" event={"ID":"bcb40a2c-18df-40db-8593-d3af9054d22a","Type":"ContainerDied","Data":"aad04881e7e42f944f11661e8a2336d3060e6c86e21f1a318d26ace0c98f7667"} Dec 11 14:10:09 crc kubenswrapper[4898]: I1211 14:10:09.720515 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7bh5" event={"ID":"bcb40a2c-18df-40db-8593-d3af9054d22a","Type":"ContainerStarted","Data":"6e582d47294cabf56dc54d49a577cceeb4b715d71e6c75deb183d92037343e34"} Dec 11 14:10:09 crc kubenswrapper[4898]: I1211 14:10:09.748715 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m7bh5" podStartSLOduration=2.040475344 podStartE2EDuration="6.748693796s" podCreationTimestamp="2025-12-11 14:10:03 +0000 UTC" firstStartedPulling="2025-12-11 14:10:04.663400613 +0000 UTC m=+3962.235727040" lastFinishedPulling="2025-12-11 14:10:09.371619055 +0000 UTC m=+3966.943945492" observedRunningTime="2025-12-11 14:10:09.742679783 +0000 UTC m=+3967.315006220" watchObservedRunningTime="2025-12-11 14:10:09.748693796 +0000 UTC m=+3967.321020233" Dec 11 14:10:13 crc kubenswrapper[4898]: I1211 14:10:13.499929 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:13 crc kubenswrapper[4898]: I1211 14:10:13.500417 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:13 crc kubenswrapper[4898]: I1211 14:10:13.548168 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:15 crc kubenswrapper[4898]: I1211 14:10:15.792904 
4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:10:15 crc kubenswrapper[4898]: E1211 14:10:15.797001 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:10:23 crc kubenswrapper[4898]: I1211 14:10:23.566276 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:23 crc kubenswrapper[4898]: I1211 14:10:23.634733 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7bh5"] Dec 11 14:10:23 crc kubenswrapper[4898]: I1211 14:10:23.894706 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m7bh5" podUID="bcb40a2c-18df-40db-8593-d3af9054d22a" containerName="registry-server" containerID="cri-o://6e582d47294cabf56dc54d49a577cceeb4b715d71e6c75deb183d92037343e34" gracePeriod=2 Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.664705 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.783303 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb40a2c-18df-40db-8593-d3af9054d22a-catalog-content\") pod \"bcb40a2c-18df-40db-8593-d3af9054d22a\" (UID: \"bcb40a2c-18df-40db-8593-d3af9054d22a\") " Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.783565 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbsj4\" (UniqueName: \"kubernetes.io/projected/bcb40a2c-18df-40db-8593-d3af9054d22a-kube-api-access-xbsj4\") pod \"bcb40a2c-18df-40db-8593-d3af9054d22a\" (UID: \"bcb40a2c-18df-40db-8593-d3af9054d22a\") " Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.783678 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb40a2c-18df-40db-8593-d3af9054d22a-utilities\") pod \"bcb40a2c-18df-40db-8593-d3af9054d22a\" (UID: \"bcb40a2c-18df-40db-8593-d3af9054d22a\") " Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.787008 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcb40a2c-18df-40db-8593-d3af9054d22a-utilities" (OuterVolumeSpecName: "utilities") pod "bcb40a2c-18df-40db-8593-d3af9054d22a" (UID: "bcb40a2c-18df-40db-8593-d3af9054d22a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.793172 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb40a2c-18df-40db-8593-d3af9054d22a-kube-api-access-xbsj4" (OuterVolumeSpecName: "kube-api-access-xbsj4") pod "bcb40a2c-18df-40db-8593-d3af9054d22a" (UID: "bcb40a2c-18df-40db-8593-d3af9054d22a"). InnerVolumeSpecName "kube-api-access-xbsj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.804275 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcb40a2c-18df-40db-8593-d3af9054d22a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bcb40a2c-18df-40db-8593-d3af9054d22a" (UID: "bcb40a2c-18df-40db-8593-d3af9054d22a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.887359 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb40a2c-18df-40db-8593-d3af9054d22a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.887682 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbsj4\" (UniqueName: \"kubernetes.io/projected/bcb40a2c-18df-40db-8593-d3af9054d22a-kube-api-access-xbsj4\") on node \"crc\" DevicePath \"\"" Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.887699 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb40a2c-18df-40db-8593-d3af9054d22a-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.908709 4898 generic.go:334] "Generic (PLEG): container finished" podID="bcb40a2c-18df-40db-8593-d3af9054d22a" containerID="6e582d47294cabf56dc54d49a577cceeb4b715d71e6c75deb183d92037343e34" exitCode=0 Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.908752 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7bh5" event={"ID":"bcb40a2c-18df-40db-8593-d3af9054d22a","Type":"ContainerDied","Data":"6e582d47294cabf56dc54d49a577cceeb4b715d71e6c75deb183d92037343e34"} Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.908782 4898 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-m7bh5" event={"ID":"bcb40a2c-18df-40db-8593-d3af9054d22a","Type":"ContainerDied","Data":"5e23f35a7c215ff18b6d9bbefb4740b13349f576d65653c2b15c1568ebec3868"} Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.908803 4898 scope.go:117] "RemoveContainer" containerID="6e582d47294cabf56dc54d49a577cceeb4b715d71e6c75deb183d92037343e34" Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.908957 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7bh5" Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.953285 4898 scope.go:117] "RemoveContainer" containerID="aad04881e7e42f944f11661e8a2336d3060e6c86e21f1a318d26ace0c98f7667" Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.957623 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7bh5"] Dec 11 14:10:24 crc kubenswrapper[4898]: I1211 14:10:24.967049 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7bh5"] Dec 11 14:10:25 crc kubenswrapper[4898]: I1211 14:10:25.704018 4898 scope.go:117] "RemoveContainer" containerID="89bebdae736e79a1bf81d49ba4bbee925007366bec0c1487b5daf2aeeeb7d21b" Dec 11 14:10:25 crc kubenswrapper[4898]: I1211 14:10:25.756961 4898 scope.go:117] "RemoveContainer" containerID="6e582d47294cabf56dc54d49a577cceeb4b715d71e6c75deb183d92037343e34" Dec 11 14:10:25 crc kubenswrapper[4898]: E1211 14:10:25.757499 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e582d47294cabf56dc54d49a577cceeb4b715d71e6c75deb183d92037343e34\": container with ID starting with 6e582d47294cabf56dc54d49a577cceeb4b715d71e6c75deb183d92037343e34 not found: ID does not exist" containerID="6e582d47294cabf56dc54d49a577cceeb4b715d71e6c75deb183d92037343e34" Dec 11 14:10:25 crc kubenswrapper[4898]: I1211 14:10:25.757542 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e582d47294cabf56dc54d49a577cceeb4b715d71e6c75deb183d92037343e34"} err="failed to get container status \"6e582d47294cabf56dc54d49a577cceeb4b715d71e6c75deb183d92037343e34\": rpc error: code = NotFound desc = could not find container \"6e582d47294cabf56dc54d49a577cceeb4b715d71e6c75deb183d92037343e34\": container with ID starting with 6e582d47294cabf56dc54d49a577cceeb4b715d71e6c75deb183d92037343e34 not found: ID does not exist" Dec 11 14:10:25 crc kubenswrapper[4898]: I1211 14:10:25.757573 4898 scope.go:117] "RemoveContainer" containerID="aad04881e7e42f944f11661e8a2336d3060e6c86e21f1a318d26ace0c98f7667" Dec 11 14:10:25 crc kubenswrapper[4898]: E1211 14:10:25.757899 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aad04881e7e42f944f11661e8a2336d3060e6c86e21f1a318d26ace0c98f7667\": container with ID starting with aad04881e7e42f944f11661e8a2336d3060e6c86e21f1a318d26ace0c98f7667 not found: ID does not exist" containerID="aad04881e7e42f944f11661e8a2336d3060e6c86e21f1a318d26ace0c98f7667" Dec 11 14:10:25 crc kubenswrapper[4898]: I1211 14:10:25.757939 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad04881e7e42f944f11661e8a2336d3060e6c86e21f1a318d26ace0c98f7667"} err="failed to get container status \"aad04881e7e42f944f11661e8a2336d3060e6c86e21f1a318d26ace0c98f7667\": rpc error: code = NotFound desc = could not find container \"aad04881e7e42f944f11661e8a2336d3060e6c86e21f1a318d26ace0c98f7667\": container with ID starting with aad04881e7e42f944f11661e8a2336d3060e6c86e21f1a318d26ace0c98f7667 not found: ID does not exist" Dec 11 14:10:25 crc kubenswrapper[4898]: I1211 14:10:25.757951 4898 scope.go:117] "RemoveContainer" containerID="89bebdae736e79a1bf81d49ba4bbee925007366bec0c1487b5daf2aeeeb7d21b" Dec 11 14:10:25 crc kubenswrapper[4898]: E1211 
14:10:25.758159 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89bebdae736e79a1bf81d49ba4bbee925007366bec0c1487b5daf2aeeeb7d21b\": container with ID starting with 89bebdae736e79a1bf81d49ba4bbee925007366bec0c1487b5daf2aeeeb7d21b not found: ID does not exist" containerID="89bebdae736e79a1bf81d49ba4bbee925007366bec0c1487b5daf2aeeeb7d21b" Dec 11 14:10:25 crc kubenswrapper[4898]: I1211 14:10:25.758205 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89bebdae736e79a1bf81d49ba4bbee925007366bec0c1487b5daf2aeeeb7d21b"} err="failed to get container status \"89bebdae736e79a1bf81d49ba4bbee925007366bec0c1487b5daf2aeeeb7d21b\": rpc error: code = NotFound desc = could not find container \"89bebdae736e79a1bf81d49ba4bbee925007366bec0c1487b5daf2aeeeb7d21b\": container with ID starting with 89bebdae736e79a1bf81d49ba4bbee925007366bec0c1487b5daf2aeeeb7d21b not found: ID does not exist" Dec 11 14:10:26 crc kubenswrapper[4898]: I1211 14:10:26.791892 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcb40a2c-18df-40db-8593-d3af9054d22a" path="/var/lib/kubelet/pods/bcb40a2c-18df-40db-8593-d3af9054d22a/volumes" Dec 11 14:10:29 crc kubenswrapper[4898]: I1211 14:10:29.774785 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:10:29 crc kubenswrapper[4898]: E1211 14:10:29.777010 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:10:43 crc kubenswrapper[4898]: I1211 14:10:43.775146 
4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:10:43 crc kubenswrapper[4898]: E1211 14:10:43.775870 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:10:50 crc kubenswrapper[4898]: E1211 14:10:50.938889 4898 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.18:47110->38.102.83.18:42973: write tcp 38.102.83.18:47110->38.102.83.18:42973: write: broken pipe Dec 11 14:10:53 crc kubenswrapper[4898]: I1211 14:10:53.785752 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p6pcm"] Dec 11 14:10:53 crc kubenswrapper[4898]: E1211 14:10:53.786904 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb40a2c-18df-40db-8593-d3af9054d22a" containerName="extract-content" Dec 11 14:10:53 crc kubenswrapper[4898]: I1211 14:10:53.786919 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb40a2c-18df-40db-8593-d3af9054d22a" containerName="extract-content" Dec 11 14:10:53 crc kubenswrapper[4898]: E1211 14:10:53.786960 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb40a2c-18df-40db-8593-d3af9054d22a" containerName="registry-server" Dec 11 14:10:53 crc kubenswrapper[4898]: I1211 14:10:53.786966 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb40a2c-18df-40db-8593-d3af9054d22a" containerName="registry-server" Dec 11 14:10:53 crc kubenswrapper[4898]: E1211 14:10:53.786992 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bcb40a2c-18df-40db-8593-d3af9054d22a" containerName="extract-utilities" Dec 11 14:10:53 crc kubenswrapper[4898]: I1211 14:10:53.786999 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb40a2c-18df-40db-8593-d3af9054d22a" containerName="extract-utilities" Dec 11 14:10:53 crc kubenswrapper[4898]: I1211 14:10:53.787296 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb40a2c-18df-40db-8593-d3af9054d22a" containerName="registry-server" Dec 11 14:10:53 crc kubenswrapper[4898]: I1211 14:10:53.789560 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:10:53 crc kubenswrapper[4898]: I1211 14:10:53.802008 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p6pcm"] Dec 11 14:10:53 crc kubenswrapper[4898]: I1211 14:10:53.938942 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgmpb\" (UniqueName: \"kubernetes.io/projected/4034b52c-b8d2-4d7a-951d-1c9538e273dc-kube-api-access-hgmpb\") pod \"certified-operators-p6pcm\" (UID: \"4034b52c-b8d2-4d7a-951d-1c9538e273dc\") " pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:10:53 crc kubenswrapper[4898]: I1211 14:10:53.938989 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4034b52c-b8d2-4d7a-951d-1c9538e273dc-utilities\") pod \"certified-operators-p6pcm\" (UID: \"4034b52c-b8d2-4d7a-951d-1c9538e273dc\") " pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:10:53 crc kubenswrapper[4898]: I1211 14:10:53.939023 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4034b52c-b8d2-4d7a-951d-1c9538e273dc-catalog-content\") pod \"certified-operators-p6pcm\" (UID: 
\"4034b52c-b8d2-4d7a-951d-1c9538e273dc\") " pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:10:54 crc kubenswrapper[4898]: I1211 14:10:54.041908 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgmpb\" (UniqueName: \"kubernetes.io/projected/4034b52c-b8d2-4d7a-951d-1c9538e273dc-kube-api-access-hgmpb\") pod \"certified-operators-p6pcm\" (UID: \"4034b52c-b8d2-4d7a-951d-1c9538e273dc\") " pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:10:54 crc kubenswrapper[4898]: I1211 14:10:54.041977 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4034b52c-b8d2-4d7a-951d-1c9538e273dc-utilities\") pod \"certified-operators-p6pcm\" (UID: \"4034b52c-b8d2-4d7a-951d-1c9538e273dc\") " pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:10:54 crc kubenswrapper[4898]: I1211 14:10:54.042026 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4034b52c-b8d2-4d7a-951d-1c9538e273dc-catalog-content\") pod \"certified-operators-p6pcm\" (UID: \"4034b52c-b8d2-4d7a-951d-1c9538e273dc\") " pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:10:54 crc kubenswrapper[4898]: I1211 14:10:54.042632 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4034b52c-b8d2-4d7a-951d-1c9538e273dc-utilities\") pod \"certified-operators-p6pcm\" (UID: \"4034b52c-b8d2-4d7a-951d-1c9538e273dc\") " pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:10:54 crc kubenswrapper[4898]: I1211 14:10:54.042709 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4034b52c-b8d2-4d7a-951d-1c9538e273dc-catalog-content\") pod \"certified-operators-p6pcm\" (UID: \"4034b52c-b8d2-4d7a-951d-1c9538e273dc\") 
" pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:10:54 crc kubenswrapper[4898]: I1211 14:10:54.062320 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgmpb\" (UniqueName: \"kubernetes.io/projected/4034b52c-b8d2-4d7a-951d-1c9538e273dc-kube-api-access-hgmpb\") pod \"certified-operators-p6pcm\" (UID: \"4034b52c-b8d2-4d7a-951d-1c9538e273dc\") " pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:10:54 crc kubenswrapper[4898]: I1211 14:10:54.121009 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:10:54 crc kubenswrapper[4898]: I1211 14:10:54.773470 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p6pcm"] Dec 11 14:10:55 crc kubenswrapper[4898]: I1211 14:10:55.238270 4898 generic.go:334] "Generic (PLEG): container finished" podID="4034b52c-b8d2-4d7a-951d-1c9538e273dc" containerID="8c2411f9952b81bfce8d6cabd85f427093f493e0fded28efa53c0f1f3c0ddfdc" exitCode=0 Dec 11 14:10:55 crc kubenswrapper[4898]: I1211 14:10:55.238341 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6pcm" event={"ID":"4034b52c-b8d2-4d7a-951d-1c9538e273dc","Type":"ContainerDied","Data":"8c2411f9952b81bfce8d6cabd85f427093f493e0fded28efa53c0f1f3c0ddfdc"} Dec 11 14:10:55 crc kubenswrapper[4898]: I1211 14:10:55.238646 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6pcm" event={"ID":"4034b52c-b8d2-4d7a-951d-1c9538e273dc","Type":"ContainerStarted","Data":"ac24e1d15dc820c891d8a660c152151761672d36c827dd9f9eddf80b3581adb7"} Dec 11 14:10:55 crc kubenswrapper[4898]: I1211 14:10:55.774764 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:10:55 crc kubenswrapper[4898]: E1211 14:10:55.775113 4898 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:10:56 crc kubenswrapper[4898]: I1211 14:10:56.249315 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6pcm" event={"ID":"4034b52c-b8d2-4d7a-951d-1c9538e273dc","Type":"ContainerStarted","Data":"2de7679de5baa44c87094f017375cb41f4f59f56da9abf37d2a07dd2c3bc5ea3"} Dec 11 14:10:57 crc kubenswrapper[4898]: I1211 14:10:57.261688 4898 generic.go:334] "Generic (PLEG): container finished" podID="4034b52c-b8d2-4d7a-951d-1c9538e273dc" containerID="2de7679de5baa44c87094f017375cb41f4f59f56da9abf37d2a07dd2c3bc5ea3" exitCode=0 Dec 11 14:10:57 crc kubenswrapper[4898]: I1211 14:10:57.262234 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6pcm" event={"ID":"4034b52c-b8d2-4d7a-951d-1c9538e273dc","Type":"ContainerDied","Data":"2de7679de5baa44c87094f017375cb41f4f59f56da9abf37d2a07dd2c3bc5ea3"} Dec 11 14:10:58 crc kubenswrapper[4898]: I1211 14:10:58.280401 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6pcm" event={"ID":"4034b52c-b8d2-4d7a-951d-1c9538e273dc","Type":"ContainerStarted","Data":"1829b7b58bccdbc1292402820d19d26682780ac071bed6e9bc1a2c9ae21f05e1"} Dec 11 14:10:58 crc kubenswrapper[4898]: I1211 14:10:58.302847 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p6pcm" podStartSLOduration=2.8843996 podStartE2EDuration="5.302825567s" podCreationTimestamp="2025-12-11 14:10:53 +0000 UTC" firstStartedPulling="2025-12-11 14:10:55.241736678 +0000 UTC 
m=+4012.814063105" lastFinishedPulling="2025-12-11 14:10:57.660162635 +0000 UTC m=+4015.232489072" observedRunningTime="2025-12-11 14:10:58.299998971 +0000 UTC m=+4015.872325408" watchObservedRunningTime="2025-12-11 14:10:58.302825567 +0000 UTC m=+4015.875152004" Dec 11 14:11:04 crc kubenswrapper[4898]: I1211 14:11:04.121768 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:11:04 crc kubenswrapper[4898]: I1211 14:11:04.122357 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:11:04 crc kubenswrapper[4898]: I1211 14:11:04.187480 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:11:05 crc kubenswrapper[4898]: I1211 14:11:05.123285 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:11:05 crc kubenswrapper[4898]: I1211 14:11:05.196738 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p6pcm"] Dec 11 14:11:06 crc kubenswrapper[4898]: I1211 14:11:06.367084 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p6pcm" podUID="4034b52c-b8d2-4d7a-951d-1c9538e273dc" containerName="registry-server" containerID="cri-o://1829b7b58bccdbc1292402820d19d26682780ac071bed6e9bc1a2c9ae21f05e1" gracePeriod=2 Dec 11 14:11:07 crc kubenswrapper[4898]: I1211 14:11:07.391968 4898 generic.go:334] "Generic (PLEG): container finished" podID="4034b52c-b8d2-4d7a-951d-1c9538e273dc" containerID="1829b7b58bccdbc1292402820d19d26682780ac071bed6e9bc1a2c9ae21f05e1" exitCode=0 Dec 11 14:11:07 crc kubenswrapper[4898]: I1211 14:11:07.392047 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6pcm" 
event={"ID":"4034b52c-b8d2-4d7a-951d-1c9538e273dc","Type":"ContainerDied","Data":"1829b7b58bccdbc1292402820d19d26682780ac071bed6e9bc1a2c9ae21f05e1"} Dec 11 14:11:08 crc kubenswrapper[4898]: I1211 14:11:08.605618 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:11:08 crc kubenswrapper[4898]: I1211 14:11:08.725255 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4034b52c-b8d2-4d7a-951d-1c9538e273dc-catalog-content\") pod \"4034b52c-b8d2-4d7a-951d-1c9538e273dc\" (UID: \"4034b52c-b8d2-4d7a-951d-1c9538e273dc\") " Dec 11 14:11:08 crc kubenswrapper[4898]: I1211 14:11:08.725523 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4034b52c-b8d2-4d7a-951d-1c9538e273dc-utilities\") pod \"4034b52c-b8d2-4d7a-951d-1c9538e273dc\" (UID: \"4034b52c-b8d2-4d7a-951d-1c9538e273dc\") " Dec 11 14:11:08 crc kubenswrapper[4898]: I1211 14:11:08.725578 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgmpb\" (UniqueName: \"kubernetes.io/projected/4034b52c-b8d2-4d7a-951d-1c9538e273dc-kube-api-access-hgmpb\") pod \"4034b52c-b8d2-4d7a-951d-1c9538e273dc\" (UID: \"4034b52c-b8d2-4d7a-951d-1c9538e273dc\") " Dec 11 14:11:08 crc kubenswrapper[4898]: I1211 14:11:08.726236 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4034b52c-b8d2-4d7a-951d-1c9538e273dc-utilities" (OuterVolumeSpecName: "utilities") pod "4034b52c-b8d2-4d7a-951d-1c9538e273dc" (UID: "4034b52c-b8d2-4d7a-951d-1c9538e273dc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:11:08 crc kubenswrapper[4898]: I1211 14:11:08.731447 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4034b52c-b8d2-4d7a-951d-1c9538e273dc-kube-api-access-hgmpb" (OuterVolumeSpecName: "kube-api-access-hgmpb") pod "4034b52c-b8d2-4d7a-951d-1c9538e273dc" (UID: "4034b52c-b8d2-4d7a-951d-1c9538e273dc"). InnerVolumeSpecName "kube-api-access-hgmpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:11:08 crc kubenswrapper[4898]: I1211 14:11:08.773697 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4034b52c-b8d2-4d7a-951d-1c9538e273dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4034b52c-b8d2-4d7a-951d-1c9538e273dc" (UID: "4034b52c-b8d2-4d7a-951d-1c9538e273dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:11:08 crc kubenswrapper[4898]: I1211 14:11:08.774968 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:11:08 crc kubenswrapper[4898]: E1211 14:11:08.775386 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:11:08 crc kubenswrapper[4898]: I1211 14:11:08.828625 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4034b52c-b8d2-4d7a-951d-1c9538e273dc-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:11:08 crc kubenswrapper[4898]: I1211 14:11:08.828664 4898 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-hgmpb\" (UniqueName: \"kubernetes.io/projected/4034b52c-b8d2-4d7a-951d-1c9538e273dc-kube-api-access-hgmpb\") on node \"crc\" DevicePath \"\"" Dec 11 14:11:08 crc kubenswrapper[4898]: I1211 14:11:08.828675 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4034b52c-b8d2-4d7a-951d-1c9538e273dc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:11:09 crc kubenswrapper[4898]: I1211 14:11:09.415910 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6pcm" event={"ID":"4034b52c-b8d2-4d7a-951d-1c9538e273dc","Type":"ContainerDied","Data":"ac24e1d15dc820c891d8a660c152151761672d36c827dd9f9eddf80b3581adb7"} Dec 11 14:11:09 crc kubenswrapper[4898]: I1211 14:11:09.415973 4898 scope.go:117] "RemoveContainer" containerID="1829b7b58bccdbc1292402820d19d26682780ac071bed6e9bc1a2c9ae21f05e1" Dec 11 14:11:09 crc kubenswrapper[4898]: I1211 14:11:09.415999 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p6pcm" Dec 11 14:11:09 crc kubenswrapper[4898]: I1211 14:11:09.439225 4898 scope.go:117] "RemoveContainer" containerID="2de7679de5baa44c87094f017375cb41f4f59f56da9abf37d2a07dd2c3bc5ea3" Dec 11 14:11:09 crc kubenswrapper[4898]: I1211 14:11:09.450289 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p6pcm"] Dec 11 14:11:09 crc kubenswrapper[4898]: I1211 14:11:09.468364 4898 scope.go:117] "RemoveContainer" containerID="8c2411f9952b81bfce8d6cabd85f427093f493e0fded28efa53c0f1f3c0ddfdc" Dec 11 14:11:09 crc kubenswrapper[4898]: I1211 14:11:09.469241 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p6pcm"] Dec 11 14:11:10 crc kubenswrapper[4898]: I1211 14:11:10.788148 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4034b52c-b8d2-4d7a-951d-1c9538e273dc" path="/var/lib/kubelet/pods/4034b52c-b8d2-4d7a-951d-1c9538e273dc/volumes" Dec 11 14:11:23 crc kubenswrapper[4898]: I1211 14:11:23.775496 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:11:23 crc kubenswrapper[4898]: E1211 14:11:23.776291 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:11:37 crc kubenswrapper[4898]: I1211 14:11:37.775757 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:11:37 crc kubenswrapper[4898]: E1211 14:11:37.776816 4898 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:11:48 crc kubenswrapper[4898]: I1211 14:11:48.775628 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:11:48 crc kubenswrapper[4898]: E1211 14:11:48.776336 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:12:01 crc kubenswrapper[4898]: I1211 14:12:01.775985 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:12:01 crc kubenswrapper[4898]: E1211 14:12:01.776908 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:12:13 crc kubenswrapper[4898]: I1211 14:12:13.774940 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:12:13 crc kubenswrapper[4898]: E1211 14:12:13.775914 4898 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:12:25 crc kubenswrapper[4898]: I1211 14:12:25.775108 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:12:25 crc kubenswrapper[4898]: E1211 14:12:25.776007 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:12:39 crc kubenswrapper[4898]: I1211 14:12:39.775146 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:12:39 crc kubenswrapper[4898]: E1211 14:12:39.776324 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:12:51 crc kubenswrapper[4898]: I1211 14:12:51.775783 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:12:51 crc kubenswrapper[4898]: E1211 14:12:51.776559 4898 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.066120 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n8hj6"] Dec 11 14:13:03 crc kubenswrapper[4898]: E1211 14:13:03.067643 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4034b52c-b8d2-4d7a-951d-1c9538e273dc" containerName="registry-server" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.067663 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4034b52c-b8d2-4d7a-951d-1c9538e273dc" containerName="registry-server" Dec 11 14:13:03 crc kubenswrapper[4898]: E1211 14:13:03.067673 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4034b52c-b8d2-4d7a-951d-1c9538e273dc" containerName="extract-content" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.067680 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4034b52c-b8d2-4d7a-951d-1c9538e273dc" containerName="extract-content" Dec 11 14:13:03 crc kubenswrapper[4898]: E1211 14:13:03.067701 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4034b52c-b8d2-4d7a-951d-1c9538e273dc" containerName="extract-utilities" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.067708 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4034b52c-b8d2-4d7a-951d-1c9538e273dc" containerName="extract-utilities" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.068187 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4034b52c-b8d2-4d7a-951d-1c9538e273dc" containerName="registry-server" Dec 11 
14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.073542 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.078340 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8hj6"] Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.157699 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qndq\" (UniqueName: \"kubernetes.io/projected/d0467ecc-ddb4-49c5-9411-9fc49f32b179-kube-api-access-9qndq\") pod \"redhat-operators-n8hj6\" (UID: \"d0467ecc-ddb4-49c5-9411-9fc49f32b179\") " pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.157994 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0467ecc-ddb4-49c5-9411-9fc49f32b179-catalog-content\") pod \"redhat-operators-n8hj6\" (UID: \"d0467ecc-ddb4-49c5-9411-9fc49f32b179\") " pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.158090 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0467ecc-ddb4-49c5-9411-9fc49f32b179-utilities\") pod \"redhat-operators-n8hj6\" (UID: \"d0467ecc-ddb4-49c5-9411-9fc49f32b179\") " pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.260029 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0467ecc-ddb4-49c5-9411-9fc49f32b179-catalog-content\") pod \"redhat-operators-n8hj6\" (UID: \"d0467ecc-ddb4-49c5-9411-9fc49f32b179\") " pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:03 crc 
kubenswrapper[4898]: I1211 14:13:03.260100 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0467ecc-ddb4-49c5-9411-9fc49f32b179-utilities\") pod \"redhat-operators-n8hj6\" (UID: \"d0467ecc-ddb4-49c5-9411-9fc49f32b179\") " pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.260168 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qndq\" (UniqueName: \"kubernetes.io/projected/d0467ecc-ddb4-49c5-9411-9fc49f32b179-kube-api-access-9qndq\") pod \"redhat-operators-n8hj6\" (UID: \"d0467ecc-ddb4-49c5-9411-9fc49f32b179\") " pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.260962 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0467ecc-ddb4-49c5-9411-9fc49f32b179-catalog-content\") pod \"redhat-operators-n8hj6\" (UID: \"d0467ecc-ddb4-49c5-9411-9fc49f32b179\") " pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.261075 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0467ecc-ddb4-49c5-9411-9fc49f32b179-utilities\") pod \"redhat-operators-n8hj6\" (UID: \"d0467ecc-ddb4-49c5-9411-9fc49f32b179\") " pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.278710 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qndq\" (UniqueName: \"kubernetes.io/projected/d0467ecc-ddb4-49c5-9411-9fc49f32b179-kube-api-access-9qndq\") pod \"redhat-operators-n8hj6\" (UID: \"d0467ecc-ddb4-49c5-9411-9fc49f32b179\") " pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.398954 4898 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.779335 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:13:03 crc kubenswrapper[4898]: E1211 14:13:03.780307 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:13:03 crc kubenswrapper[4898]: I1211 14:13:03.953152 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8hj6"] Dec 11 14:13:03 crc kubenswrapper[4898]: W1211 14:13:03.968775 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0467ecc_ddb4_49c5_9411_9fc49f32b179.slice/crio-8b7f2d02003bd07933b461301ee439899732edd995cd6a7ed019e8ba2138777d WatchSource:0}: Error finding container 8b7f2d02003bd07933b461301ee439899732edd995cd6a7ed019e8ba2138777d: Status 404 returned error can't find the container with id 8b7f2d02003bd07933b461301ee439899732edd995cd6a7ed019e8ba2138777d Dec 11 14:13:04 crc kubenswrapper[4898]: I1211 14:13:04.762964 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8hj6" event={"ID":"d0467ecc-ddb4-49c5-9411-9fc49f32b179","Type":"ContainerStarted","Data":"8b7f2d02003bd07933b461301ee439899732edd995cd6a7ed019e8ba2138777d"} Dec 11 14:13:05 crc kubenswrapper[4898]: I1211 14:13:05.776356 4898 generic.go:334] "Generic (PLEG): container finished" podID="d0467ecc-ddb4-49c5-9411-9fc49f32b179" 
containerID="7fb6600585eea0d82fae1817cc455ff3f6382ae9a62675d45486ca96037318ad" exitCode=0 Dec 11 14:13:05 crc kubenswrapper[4898]: I1211 14:13:05.776476 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8hj6" event={"ID":"d0467ecc-ddb4-49c5-9411-9fc49f32b179","Type":"ContainerDied","Data":"7fb6600585eea0d82fae1817cc455ff3f6382ae9a62675d45486ca96037318ad"} Dec 11 14:13:05 crc kubenswrapper[4898]: I1211 14:13:05.778673 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 14:13:07 crc kubenswrapper[4898]: I1211 14:13:07.800703 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8hj6" event={"ID":"d0467ecc-ddb4-49c5-9411-9fc49f32b179","Type":"ContainerStarted","Data":"b3eda30b710c547193599c26a69e8cddf4ce22e4c070920201c3338c3ad075c4"} Dec 11 14:13:11 crc kubenswrapper[4898]: I1211 14:13:11.850545 4898 generic.go:334] "Generic (PLEG): container finished" podID="d0467ecc-ddb4-49c5-9411-9fc49f32b179" containerID="b3eda30b710c547193599c26a69e8cddf4ce22e4c070920201c3338c3ad075c4" exitCode=0 Dec 11 14:13:11 crc kubenswrapper[4898]: I1211 14:13:11.850627 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8hj6" event={"ID":"d0467ecc-ddb4-49c5-9411-9fc49f32b179","Type":"ContainerDied","Data":"b3eda30b710c547193599c26a69e8cddf4ce22e4c070920201c3338c3ad075c4"} Dec 11 14:13:13 crc kubenswrapper[4898]: I1211 14:13:13.877168 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8hj6" event={"ID":"d0467ecc-ddb4-49c5-9411-9fc49f32b179","Type":"ContainerStarted","Data":"94dd50ea2f7841d6ed041d7db2b5f8375d3984457aaf4c2b5b3f5a49fd3454f9"} Dec 11 14:13:13 crc kubenswrapper[4898]: I1211 14:13:13.939135 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n8hj6" 
podStartSLOduration=3.892867154 podStartE2EDuration="10.939112864s" podCreationTimestamp="2025-12-11 14:13:03 +0000 UTC" firstStartedPulling="2025-12-11 14:13:05.778315142 +0000 UTC m=+4143.350641589" lastFinishedPulling="2025-12-11 14:13:12.824560862 +0000 UTC m=+4150.396887299" observedRunningTime="2025-12-11 14:13:13.906607316 +0000 UTC m=+4151.478933773" watchObservedRunningTime="2025-12-11 14:13:13.939112864 +0000 UTC m=+4151.511439301" Dec 11 14:13:17 crc kubenswrapper[4898]: I1211 14:13:17.775262 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:13:18 crc kubenswrapper[4898]: I1211 14:13:18.927834 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"fa652197dab8b227fc745df831449c10b94718795d3847ba94b52a0e5e7de671"} Dec 11 14:13:23 crc kubenswrapper[4898]: I1211 14:13:23.403284 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:23 crc kubenswrapper[4898]: I1211 14:13:23.403936 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:23 crc kubenswrapper[4898]: I1211 14:13:23.480503 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:24 crc kubenswrapper[4898]: I1211 14:13:24.034315 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:25 crc kubenswrapper[4898]: I1211 14:13:25.210015 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8hj6"] Dec 11 14:13:25 crc kubenswrapper[4898]: I1211 14:13:25.999695 4898 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-marketplace/redhat-operators-n8hj6" podUID="d0467ecc-ddb4-49c5-9411-9fc49f32b179" containerName="registry-server" containerID="cri-o://94dd50ea2f7841d6ed041d7db2b5f8375d3984457aaf4c2b5b3f5a49fd3454f9" gracePeriod=2 Dec 11 14:13:27 crc kubenswrapper[4898]: I1211 14:13:27.025461 4898 generic.go:334] "Generic (PLEG): container finished" podID="d0467ecc-ddb4-49c5-9411-9fc49f32b179" containerID="94dd50ea2f7841d6ed041d7db2b5f8375d3984457aaf4c2b5b3f5a49fd3454f9" exitCode=0 Dec 11 14:13:27 crc kubenswrapper[4898]: I1211 14:13:27.026261 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8hj6" event={"ID":"d0467ecc-ddb4-49c5-9411-9fc49f32b179","Type":"ContainerDied","Data":"94dd50ea2f7841d6ed041d7db2b5f8375d3984457aaf4c2b5b3f5a49fd3454f9"} Dec 11 14:13:27 crc kubenswrapper[4898]: I1211 14:13:27.026321 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8hj6" event={"ID":"d0467ecc-ddb4-49c5-9411-9fc49f32b179","Type":"ContainerDied","Data":"8b7f2d02003bd07933b461301ee439899732edd995cd6a7ed019e8ba2138777d"} Dec 11 14:13:27 crc kubenswrapper[4898]: I1211 14:13:27.026339 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b7f2d02003bd07933b461301ee439899732edd995cd6a7ed019e8ba2138777d" Dec 11 14:13:27 crc kubenswrapper[4898]: I1211 14:13:27.154865 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:27 crc kubenswrapper[4898]: I1211 14:13:27.248056 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qndq\" (UniqueName: \"kubernetes.io/projected/d0467ecc-ddb4-49c5-9411-9fc49f32b179-kube-api-access-9qndq\") pod \"d0467ecc-ddb4-49c5-9411-9fc49f32b179\" (UID: \"d0467ecc-ddb4-49c5-9411-9fc49f32b179\") " Dec 11 14:13:27 crc kubenswrapper[4898]: I1211 14:13:27.248170 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0467ecc-ddb4-49c5-9411-9fc49f32b179-catalog-content\") pod \"d0467ecc-ddb4-49c5-9411-9fc49f32b179\" (UID: \"d0467ecc-ddb4-49c5-9411-9fc49f32b179\") " Dec 11 14:13:27 crc kubenswrapper[4898]: I1211 14:13:27.248366 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0467ecc-ddb4-49c5-9411-9fc49f32b179-utilities\") pod \"d0467ecc-ddb4-49c5-9411-9fc49f32b179\" (UID: \"d0467ecc-ddb4-49c5-9411-9fc49f32b179\") " Dec 11 14:13:27 crc kubenswrapper[4898]: I1211 14:13:27.250053 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0467ecc-ddb4-49c5-9411-9fc49f32b179-utilities" (OuterVolumeSpecName: "utilities") pod "d0467ecc-ddb4-49c5-9411-9fc49f32b179" (UID: "d0467ecc-ddb4-49c5-9411-9fc49f32b179"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:13:27 crc kubenswrapper[4898]: I1211 14:13:27.254895 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0467ecc-ddb4-49c5-9411-9fc49f32b179-kube-api-access-9qndq" (OuterVolumeSpecName: "kube-api-access-9qndq") pod "d0467ecc-ddb4-49c5-9411-9fc49f32b179" (UID: "d0467ecc-ddb4-49c5-9411-9fc49f32b179"). InnerVolumeSpecName "kube-api-access-9qndq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:13:27 crc kubenswrapper[4898]: I1211 14:13:27.352782 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qndq\" (UniqueName: \"kubernetes.io/projected/d0467ecc-ddb4-49c5-9411-9fc49f32b179-kube-api-access-9qndq\") on node \"crc\" DevicePath \"\"" Dec 11 14:13:27 crc kubenswrapper[4898]: I1211 14:13:27.352809 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0467ecc-ddb4-49c5-9411-9fc49f32b179-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:13:27 crc kubenswrapper[4898]: I1211 14:13:27.397513 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0467ecc-ddb4-49c5-9411-9fc49f32b179-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0467ecc-ddb4-49c5-9411-9fc49f32b179" (UID: "d0467ecc-ddb4-49c5-9411-9fc49f32b179"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:13:27 crc kubenswrapper[4898]: I1211 14:13:27.454822 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0467ecc-ddb4-49c5-9411-9fc49f32b179-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:13:28 crc kubenswrapper[4898]: I1211 14:13:28.036493 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n8hj6" Dec 11 14:13:28 crc kubenswrapper[4898]: I1211 14:13:28.076006 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8hj6"] Dec 11 14:13:28 crc kubenswrapper[4898]: I1211 14:13:28.085933 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n8hj6"] Dec 11 14:13:28 crc kubenswrapper[4898]: I1211 14:13:28.793071 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0467ecc-ddb4-49c5-9411-9fc49f32b179" path="/var/lib/kubelet/pods/d0467ecc-ddb4-49c5-9411-9fc49f32b179/volumes" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.196933 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249"] Dec 11 14:15:00 crc kubenswrapper[4898]: E1211 14:15:00.198021 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0467ecc-ddb4-49c5-9411-9fc49f32b179" containerName="extract-content" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.198035 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0467ecc-ddb4-49c5-9411-9fc49f32b179" containerName="extract-content" Dec 11 14:15:00 crc kubenswrapper[4898]: E1211 14:15:00.198068 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0467ecc-ddb4-49c5-9411-9fc49f32b179" containerName="registry-server" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.198075 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0467ecc-ddb4-49c5-9411-9fc49f32b179" containerName="registry-server" Dec 11 14:15:00 crc kubenswrapper[4898]: E1211 14:15:00.198097 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0467ecc-ddb4-49c5-9411-9fc49f32b179" containerName="extract-utilities" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.198105 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d0467ecc-ddb4-49c5-9411-9fc49f32b179" containerName="extract-utilities" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.198361 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0467ecc-ddb4-49c5-9411-9fc49f32b179" containerName="registry-server" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.199419 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.201715 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.202924 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.211474 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249"] Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.211994 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6df206e-8f23-47c0-aafb-f37b8143cd18-config-volume\") pod \"collect-profiles-29424375-qb249\" (UID: \"b6df206e-8f23-47c0-aafb-f37b8143cd18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.212787 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn4sz\" (UniqueName: \"kubernetes.io/projected/b6df206e-8f23-47c0-aafb-f37b8143cd18-kube-api-access-nn4sz\") pod \"collect-profiles-29424375-qb249\" (UID: \"b6df206e-8f23-47c0-aafb-f37b8143cd18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" Dec 11 
14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.212880 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6df206e-8f23-47c0-aafb-f37b8143cd18-secret-volume\") pod \"collect-profiles-29424375-qb249\" (UID: \"b6df206e-8f23-47c0-aafb-f37b8143cd18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.315332 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6df206e-8f23-47c0-aafb-f37b8143cd18-config-volume\") pod \"collect-profiles-29424375-qb249\" (UID: \"b6df206e-8f23-47c0-aafb-f37b8143cd18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.315583 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn4sz\" (UniqueName: \"kubernetes.io/projected/b6df206e-8f23-47c0-aafb-f37b8143cd18-kube-api-access-nn4sz\") pod \"collect-profiles-29424375-qb249\" (UID: \"b6df206e-8f23-47c0-aafb-f37b8143cd18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.315650 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6df206e-8f23-47c0-aafb-f37b8143cd18-secret-volume\") pod \"collect-profiles-29424375-qb249\" (UID: \"b6df206e-8f23-47c0-aafb-f37b8143cd18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.316384 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6df206e-8f23-47c0-aafb-f37b8143cd18-config-volume\") pod \"collect-profiles-29424375-qb249\" (UID: 
\"b6df206e-8f23-47c0-aafb-f37b8143cd18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.323186 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6df206e-8f23-47c0-aafb-f37b8143cd18-secret-volume\") pod \"collect-profiles-29424375-qb249\" (UID: \"b6df206e-8f23-47c0-aafb-f37b8143cd18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.332061 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn4sz\" (UniqueName: \"kubernetes.io/projected/b6df206e-8f23-47c0-aafb-f37b8143cd18-kube-api-access-nn4sz\") pod \"collect-profiles-29424375-qb249\" (UID: \"b6df206e-8f23-47c0-aafb-f37b8143cd18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" Dec 11 14:15:00 crc kubenswrapper[4898]: I1211 14:15:00.522999 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" Dec 11 14:15:01 crc kubenswrapper[4898]: I1211 14:15:00.999446 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249"] Dec 11 14:15:01 crc kubenswrapper[4898]: I1211 14:15:01.200351 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" event={"ID":"b6df206e-8f23-47c0-aafb-f37b8143cd18","Type":"ContainerStarted","Data":"a8f8fe912e0e46fa3f1e9ae1544dad45ee2440a796c89579b3d1cbb9f3673fc1"} Dec 11 14:15:01 crc kubenswrapper[4898]: I1211 14:15:01.200628 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" event={"ID":"b6df206e-8f23-47c0-aafb-f37b8143cd18","Type":"ContainerStarted","Data":"34123ba21f64cf1a7885e663320e74c71d1f0ad8748764879c42dc2df1a859fe"} Dec 11 14:15:01 crc kubenswrapper[4898]: I1211 14:15:01.232296 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" podStartSLOduration=1.232272778 podStartE2EDuration="1.232272778s" podCreationTimestamp="2025-12-11 14:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 14:15:01.214987492 +0000 UTC m=+4258.787313949" watchObservedRunningTime="2025-12-11 14:15:01.232272778 +0000 UTC m=+4258.804599225" Dec 11 14:15:02 crc kubenswrapper[4898]: I1211 14:15:02.211392 4898 generic.go:334] "Generic (PLEG): container finished" podID="b6df206e-8f23-47c0-aafb-f37b8143cd18" containerID="a8f8fe912e0e46fa3f1e9ae1544dad45ee2440a796c89579b3d1cbb9f3673fc1" exitCode=0 Dec 11 14:15:02 crc kubenswrapper[4898]: I1211 14:15:02.211447 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" event={"ID":"b6df206e-8f23-47c0-aafb-f37b8143cd18","Type":"ContainerDied","Data":"a8f8fe912e0e46fa3f1e9ae1544dad45ee2440a796c89579b3d1cbb9f3673fc1"} Dec 11 14:15:03 crc kubenswrapper[4898]: I1211 14:15:03.624164 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" Dec 11 14:15:03 crc kubenswrapper[4898]: I1211 14:15:03.799132 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6df206e-8f23-47c0-aafb-f37b8143cd18-secret-volume\") pod \"b6df206e-8f23-47c0-aafb-f37b8143cd18\" (UID: \"b6df206e-8f23-47c0-aafb-f37b8143cd18\") " Dec 11 14:15:03 crc kubenswrapper[4898]: I1211 14:15:03.799675 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn4sz\" (UniqueName: \"kubernetes.io/projected/b6df206e-8f23-47c0-aafb-f37b8143cd18-kube-api-access-nn4sz\") pod \"b6df206e-8f23-47c0-aafb-f37b8143cd18\" (UID: \"b6df206e-8f23-47c0-aafb-f37b8143cd18\") " Dec 11 14:15:03 crc kubenswrapper[4898]: I1211 14:15:03.799771 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6df206e-8f23-47c0-aafb-f37b8143cd18-config-volume\") pod \"b6df206e-8f23-47c0-aafb-f37b8143cd18\" (UID: \"b6df206e-8f23-47c0-aafb-f37b8143cd18\") " Dec 11 14:15:03 crc kubenswrapper[4898]: I1211 14:15:03.800935 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6df206e-8f23-47c0-aafb-f37b8143cd18-config-volume" (OuterVolumeSpecName: "config-volume") pod "b6df206e-8f23-47c0-aafb-f37b8143cd18" (UID: "b6df206e-8f23-47c0-aafb-f37b8143cd18"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 14:15:03 crc kubenswrapper[4898]: I1211 14:15:03.808196 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6df206e-8f23-47c0-aafb-f37b8143cd18-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b6df206e-8f23-47c0-aafb-f37b8143cd18" (UID: "b6df206e-8f23-47c0-aafb-f37b8143cd18"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 14:15:03 crc kubenswrapper[4898]: I1211 14:15:03.811408 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6df206e-8f23-47c0-aafb-f37b8143cd18-kube-api-access-nn4sz" (OuterVolumeSpecName: "kube-api-access-nn4sz") pod "b6df206e-8f23-47c0-aafb-f37b8143cd18" (UID: "b6df206e-8f23-47c0-aafb-f37b8143cd18"). InnerVolumeSpecName "kube-api-access-nn4sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:15:03 crc kubenswrapper[4898]: I1211 14:15:03.903529 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6df206e-8f23-47c0-aafb-f37b8143cd18-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 14:15:03 crc kubenswrapper[4898]: I1211 14:15:03.903564 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6df206e-8f23-47c0-aafb-f37b8143cd18-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 11 14:15:03 crc kubenswrapper[4898]: I1211 14:15:03.903574 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn4sz\" (UniqueName: \"kubernetes.io/projected/b6df206e-8f23-47c0-aafb-f37b8143cd18-kube-api-access-nn4sz\") on node \"crc\" DevicePath \"\"" Dec 11 14:15:04 crc kubenswrapper[4898]: I1211 14:15:04.237949 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" 
event={"ID":"b6df206e-8f23-47c0-aafb-f37b8143cd18","Type":"ContainerDied","Data":"34123ba21f64cf1a7885e663320e74c71d1f0ad8748764879c42dc2df1a859fe"} Dec 11 14:15:04 crc kubenswrapper[4898]: I1211 14:15:04.237990 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424375-qb249" Dec 11 14:15:04 crc kubenswrapper[4898]: I1211 14:15:04.237997 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34123ba21f64cf1a7885e663320e74c71d1f0ad8748764879c42dc2df1a859fe" Dec 11 14:15:04 crc kubenswrapper[4898]: I1211 14:15:04.293051 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t"] Dec 11 14:15:04 crc kubenswrapper[4898]: I1211 14:15:04.304296 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424330-fzl2t"] Dec 11 14:15:04 crc kubenswrapper[4898]: I1211 14:15:04.792111 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f590609a-e2e1-4add-97e0-9d08b6a2c723" path="/var/lib/kubelet/pods/f590609a-e2e1-4add-97e0-9d08b6a2c723/volumes" Dec 11 14:15:34 crc kubenswrapper[4898]: I1211 14:15:34.995744 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:15:34 crc kubenswrapper[4898]: I1211 14:15:34.996349 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:15:40 crc 
kubenswrapper[4898]: I1211 14:15:40.602613 4898 scope.go:117] "RemoveContainer" containerID="e80ed6a6aac5e97971b98ea3a79f0be65d4937042cd9fa25691caa735cae5157" Dec 11 14:16:04 crc kubenswrapper[4898]: I1211 14:16:04.995583 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:16:04 crc kubenswrapper[4898]: I1211 14:16:04.996151 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:16:34 crc kubenswrapper[4898]: I1211 14:16:34.995615 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:16:34 crc kubenswrapper[4898]: I1211 14:16:34.996376 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:16:34 crc kubenswrapper[4898]: I1211 14:16:34.996446 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 14:16:34 crc kubenswrapper[4898]: I1211 14:16:34.998802 4898 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa652197dab8b227fc745df831449c10b94718795d3847ba94b52a0e5e7de671"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 14:16:34 crc kubenswrapper[4898]: I1211 14:16:34.998919 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://fa652197dab8b227fc745df831449c10b94718795d3847ba94b52a0e5e7de671" gracePeriod=600 Dec 11 14:16:35 crc kubenswrapper[4898]: I1211 14:16:35.318181 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="fa652197dab8b227fc745df831449c10b94718795d3847ba94b52a0e5e7de671" exitCode=0 Dec 11 14:16:35 crc kubenswrapper[4898]: I1211 14:16:35.318255 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"fa652197dab8b227fc745df831449c10b94718795d3847ba94b52a0e5e7de671"} Dec 11 14:16:35 crc kubenswrapper[4898]: I1211 14:16:35.318719 4898 scope.go:117] "RemoveContainer" containerID="84b920018f008e0bbb176a3783d9c1ca5cdd95c65a5b9d87d90ba78149ad04f3" Dec 11 14:16:36 crc kubenswrapper[4898]: I1211 14:16:36.345983 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158"} Dec 11 14:17:42 crc kubenswrapper[4898]: I1211 14:17:42.506089 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xqhb7"] Dec 11 14:17:42 crc 
kubenswrapper[4898]: E1211 14:17:42.507562 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6df206e-8f23-47c0-aafb-f37b8143cd18" containerName="collect-profiles" Dec 11 14:17:42 crc kubenswrapper[4898]: I1211 14:17:42.507587 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6df206e-8f23-47c0-aafb-f37b8143cd18" containerName="collect-profiles" Dec 11 14:17:42 crc kubenswrapper[4898]: I1211 14:17:42.508004 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6df206e-8f23-47c0-aafb-f37b8143cd18" containerName="collect-profiles" Dec 11 14:17:42 crc kubenswrapper[4898]: I1211 14:17:42.510913 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:42 crc kubenswrapper[4898]: I1211 14:17:42.539968 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xqhb7"] Dec 11 14:17:42 crc kubenswrapper[4898]: I1211 14:17:42.581237 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbhzc\" (UniqueName: \"kubernetes.io/projected/56128d2c-b6db-46c6-976a-23d10e540195-kube-api-access-dbhzc\") pod \"community-operators-xqhb7\" (UID: \"56128d2c-b6db-46c6-976a-23d10e540195\") " pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:42 crc kubenswrapper[4898]: I1211 14:17:42.581363 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56128d2c-b6db-46c6-976a-23d10e540195-utilities\") pod \"community-operators-xqhb7\" (UID: \"56128d2c-b6db-46c6-976a-23d10e540195\") " pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:42 crc kubenswrapper[4898]: I1211 14:17:42.581439 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/56128d2c-b6db-46c6-976a-23d10e540195-catalog-content\") pod \"community-operators-xqhb7\" (UID: \"56128d2c-b6db-46c6-976a-23d10e540195\") " pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:42 crc kubenswrapper[4898]: I1211 14:17:42.684110 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbhzc\" (UniqueName: \"kubernetes.io/projected/56128d2c-b6db-46c6-976a-23d10e540195-kube-api-access-dbhzc\") pod \"community-operators-xqhb7\" (UID: \"56128d2c-b6db-46c6-976a-23d10e540195\") " pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:42 crc kubenswrapper[4898]: I1211 14:17:42.684710 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56128d2c-b6db-46c6-976a-23d10e540195-utilities\") pod \"community-operators-xqhb7\" (UID: \"56128d2c-b6db-46c6-976a-23d10e540195\") " pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:42 crc kubenswrapper[4898]: I1211 14:17:42.685246 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56128d2c-b6db-46c6-976a-23d10e540195-utilities\") pod \"community-operators-xqhb7\" (UID: \"56128d2c-b6db-46c6-976a-23d10e540195\") " pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:42 crc kubenswrapper[4898]: I1211 14:17:42.685683 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56128d2c-b6db-46c6-976a-23d10e540195-catalog-content\") pod \"community-operators-xqhb7\" (UID: \"56128d2c-b6db-46c6-976a-23d10e540195\") " pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:42 crc kubenswrapper[4898]: I1211 14:17:42.685347 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/56128d2c-b6db-46c6-976a-23d10e540195-catalog-content\") pod \"community-operators-xqhb7\" (UID: \"56128d2c-b6db-46c6-976a-23d10e540195\") " pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:42 crc kubenswrapper[4898]: I1211 14:17:42.707474 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbhzc\" (UniqueName: \"kubernetes.io/projected/56128d2c-b6db-46c6-976a-23d10e540195-kube-api-access-dbhzc\") pod \"community-operators-xqhb7\" (UID: \"56128d2c-b6db-46c6-976a-23d10e540195\") " pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:42 crc kubenswrapper[4898]: I1211 14:17:42.840742 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:43 crc kubenswrapper[4898]: I1211 14:17:43.510515 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xqhb7"] Dec 11 14:17:44 crc kubenswrapper[4898]: I1211 14:17:44.214941 4898 generic.go:334] "Generic (PLEG): container finished" podID="56128d2c-b6db-46c6-976a-23d10e540195" containerID="b00d23c80d694165cedba2abd7da4433e534ae0ab976674bf35ddea5e5222290" exitCode=0 Dec 11 14:17:44 crc kubenswrapper[4898]: I1211 14:17:44.215001 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqhb7" event={"ID":"56128d2c-b6db-46c6-976a-23d10e540195","Type":"ContainerDied","Data":"b00d23c80d694165cedba2abd7da4433e534ae0ab976674bf35ddea5e5222290"} Dec 11 14:17:44 crc kubenswrapper[4898]: I1211 14:17:44.215349 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqhb7" event={"ID":"56128d2c-b6db-46c6-976a-23d10e540195","Type":"ContainerStarted","Data":"1f4fe1a2bab22b8f9f9316240739194d0c9c81fba589b7688e9b53cfd472375b"} Dec 11 14:17:45 crc kubenswrapper[4898]: I1211 14:17:45.227209 4898 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-xqhb7" event={"ID":"56128d2c-b6db-46c6-976a-23d10e540195","Type":"ContainerStarted","Data":"5546cd7e0ef816ba03eaa553c8f0ad639a854d178adb5691d309be72f2391c8f"} Dec 11 14:17:47 crc kubenswrapper[4898]: I1211 14:17:47.249517 4898 generic.go:334] "Generic (PLEG): container finished" podID="56128d2c-b6db-46c6-976a-23d10e540195" containerID="5546cd7e0ef816ba03eaa553c8f0ad639a854d178adb5691d309be72f2391c8f" exitCode=0 Dec 11 14:17:47 crc kubenswrapper[4898]: I1211 14:17:47.249600 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqhb7" event={"ID":"56128d2c-b6db-46c6-976a-23d10e540195","Type":"ContainerDied","Data":"5546cd7e0ef816ba03eaa553c8f0ad639a854d178adb5691d309be72f2391c8f"} Dec 11 14:17:47 crc kubenswrapper[4898]: E1211 14:17:47.595733 4898 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.18:40356->38.102.83.18:42973: write tcp 38.102.83.18:40356->38.102.83.18:42973: write: broken pipe Dec 11 14:17:48 crc kubenswrapper[4898]: I1211 14:17:48.269277 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqhb7" event={"ID":"56128d2c-b6db-46c6-976a-23d10e540195","Type":"ContainerStarted","Data":"79346fcfba9c93e82b47d3d8f926dd55f1fab9643ce97c9d72616c75f59a5f76"} Dec 11 14:17:48 crc kubenswrapper[4898]: I1211 14:17:48.297536 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xqhb7" podStartSLOduration=2.713031088 podStartE2EDuration="6.297512679s" podCreationTimestamp="2025-12-11 14:17:42 +0000 UTC" firstStartedPulling="2025-12-11 14:17:44.21827998 +0000 UTC m=+4421.790606457" lastFinishedPulling="2025-12-11 14:17:47.802761591 +0000 UTC m=+4425.375088048" observedRunningTime="2025-12-11 14:17:48.287769876 +0000 UTC m=+4425.860096323" watchObservedRunningTime="2025-12-11 14:17:48.297512679 +0000 UTC 
m=+4425.869839116" Dec 11 14:17:52 crc kubenswrapper[4898]: I1211 14:17:52.841298 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:52 crc kubenswrapper[4898]: I1211 14:17:52.843049 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:52 crc kubenswrapper[4898]: I1211 14:17:52.904151 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:53 crc kubenswrapper[4898]: I1211 14:17:53.423523 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:53 crc kubenswrapper[4898]: I1211 14:17:53.482756 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xqhb7"] Dec 11 14:17:55 crc kubenswrapper[4898]: I1211 14:17:55.361057 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xqhb7" podUID="56128d2c-b6db-46c6-976a-23d10e540195" containerName="registry-server" containerID="cri-o://79346fcfba9c93e82b47d3d8f926dd55f1fab9643ce97c9d72616c75f59a5f76" gracePeriod=2 Dec 11 14:17:55 crc kubenswrapper[4898]: I1211 14:17:55.936912 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.130808 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56128d2c-b6db-46c6-976a-23d10e540195-utilities\") pod \"56128d2c-b6db-46c6-976a-23d10e540195\" (UID: \"56128d2c-b6db-46c6-976a-23d10e540195\") " Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.130887 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56128d2c-b6db-46c6-976a-23d10e540195-catalog-content\") pod \"56128d2c-b6db-46c6-976a-23d10e540195\" (UID: \"56128d2c-b6db-46c6-976a-23d10e540195\") " Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.131198 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbhzc\" (UniqueName: \"kubernetes.io/projected/56128d2c-b6db-46c6-976a-23d10e540195-kube-api-access-dbhzc\") pod \"56128d2c-b6db-46c6-976a-23d10e540195\" (UID: \"56128d2c-b6db-46c6-976a-23d10e540195\") " Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.133623 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56128d2c-b6db-46c6-976a-23d10e540195-utilities" (OuterVolumeSpecName: "utilities") pod "56128d2c-b6db-46c6-976a-23d10e540195" (UID: "56128d2c-b6db-46c6-976a-23d10e540195"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.144048 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56128d2c-b6db-46c6-976a-23d10e540195-kube-api-access-dbhzc" (OuterVolumeSpecName: "kube-api-access-dbhzc") pod "56128d2c-b6db-46c6-976a-23d10e540195" (UID: "56128d2c-b6db-46c6-976a-23d10e540195"). InnerVolumeSpecName "kube-api-access-dbhzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.209384 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56128d2c-b6db-46c6-976a-23d10e540195-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56128d2c-b6db-46c6-976a-23d10e540195" (UID: "56128d2c-b6db-46c6-976a-23d10e540195"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.233764 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbhzc\" (UniqueName: \"kubernetes.io/projected/56128d2c-b6db-46c6-976a-23d10e540195-kube-api-access-dbhzc\") on node \"crc\" DevicePath \"\"" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.233798 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56128d2c-b6db-46c6-976a-23d10e540195-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.233808 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56128d2c-b6db-46c6-976a-23d10e540195-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.375735 4898 generic.go:334] "Generic (PLEG): container finished" podID="56128d2c-b6db-46c6-976a-23d10e540195" containerID="79346fcfba9c93e82b47d3d8f926dd55f1fab9643ce97c9d72616c75f59a5f76" exitCode=0 Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.375854 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xqhb7" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.375871 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqhb7" event={"ID":"56128d2c-b6db-46c6-976a-23d10e540195","Type":"ContainerDied","Data":"79346fcfba9c93e82b47d3d8f926dd55f1fab9643ce97c9d72616c75f59a5f76"} Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.375924 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqhb7" event={"ID":"56128d2c-b6db-46c6-976a-23d10e540195","Type":"ContainerDied","Data":"1f4fe1a2bab22b8f9f9316240739194d0c9c81fba589b7688e9b53cfd472375b"} Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.375945 4898 scope.go:117] "RemoveContainer" containerID="79346fcfba9c93e82b47d3d8f926dd55f1fab9643ce97c9d72616c75f59a5f76" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.408415 4898 scope.go:117] "RemoveContainer" containerID="5546cd7e0ef816ba03eaa553c8f0ad639a854d178adb5691d309be72f2391c8f" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.435482 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xqhb7"] Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.448427 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xqhb7"] Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.451262 4898 scope.go:117] "RemoveContainer" containerID="b00d23c80d694165cedba2abd7da4433e534ae0ab976674bf35ddea5e5222290" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.518165 4898 scope.go:117] "RemoveContainer" containerID="79346fcfba9c93e82b47d3d8f926dd55f1fab9643ce97c9d72616c75f59a5f76" Dec 11 14:17:56 crc kubenswrapper[4898]: E1211 14:17:56.518505 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"79346fcfba9c93e82b47d3d8f926dd55f1fab9643ce97c9d72616c75f59a5f76\": container with ID starting with 79346fcfba9c93e82b47d3d8f926dd55f1fab9643ce97c9d72616c75f59a5f76 not found: ID does not exist" containerID="79346fcfba9c93e82b47d3d8f926dd55f1fab9643ce97c9d72616c75f59a5f76" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.518535 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79346fcfba9c93e82b47d3d8f926dd55f1fab9643ce97c9d72616c75f59a5f76"} err="failed to get container status \"79346fcfba9c93e82b47d3d8f926dd55f1fab9643ce97c9d72616c75f59a5f76\": rpc error: code = NotFound desc = could not find container \"79346fcfba9c93e82b47d3d8f926dd55f1fab9643ce97c9d72616c75f59a5f76\": container with ID starting with 79346fcfba9c93e82b47d3d8f926dd55f1fab9643ce97c9d72616c75f59a5f76 not found: ID does not exist" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.518557 4898 scope.go:117] "RemoveContainer" containerID="5546cd7e0ef816ba03eaa553c8f0ad639a854d178adb5691d309be72f2391c8f" Dec 11 14:17:56 crc kubenswrapper[4898]: E1211 14:17:56.518834 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5546cd7e0ef816ba03eaa553c8f0ad639a854d178adb5691d309be72f2391c8f\": container with ID starting with 5546cd7e0ef816ba03eaa553c8f0ad639a854d178adb5691d309be72f2391c8f not found: ID does not exist" containerID="5546cd7e0ef816ba03eaa553c8f0ad639a854d178adb5691d309be72f2391c8f" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.518867 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5546cd7e0ef816ba03eaa553c8f0ad639a854d178adb5691d309be72f2391c8f"} err="failed to get container status \"5546cd7e0ef816ba03eaa553c8f0ad639a854d178adb5691d309be72f2391c8f\": rpc error: code = NotFound desc = could not find container \"5546cd7e0ef816ba03eaa553c8f0ad639a854d178adb5691d309be72f2391c8f\": container with ID 
starting with 5546cd7e0ef816ba03eaa553c8f0ad639a854d178adb5691d309be72f2391c8f not found: ID does not exist" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.518886 4898 scope.go:117] "RemoveContainer" containerID="b00d23c80d694165cedba2abd7da4433e534ae0ab976674bf35ddea5e5222290" Dec 11 14:17:56 crc kubenswrapper[4898]: E1211 14:17:56.519154 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b00d23c80d694165cedba2abd7da4433e534ae0ab976674bf35ddea5e5222290\": container with ID starting with b00d23c80d694165cedba2abd7da4433e534ae0ab976674bf35ddea5e5222290 not found: ID does not exist" containerID="b00d23c80d694165cedba2abd7da4433e534ae0ab976674bf35ddea5e5222290" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.519184 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b00d23c80d694165cedba2abd7da4433e534ae0ab976674bf35ddea5e5222290"} err="failed to get container status \"b00d23c80d694165cedba2abd7da4433e534ae0ab976674bf35ddea5e5222290\": rpc error: code = NotFound desc = could not find container \"b00d23c80d694165cedba2abd7da4433e534ae0ab976674bf35ddea5e5222290\": container with ID starting with b00d23c80d694165cedba2abd7da4433e534ae0ab976674bf35ddea5e5222290 not found: ID does not exist" Dec 11 14:17:56 crc kubenswrapper[4898]: I1211 14:17:56.788238 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56128d2c-b6db-46c6-976a-23d10e540195" path="/var/lib/kubelet/pods/56128d2c-b6db-46c6-976a-23d10e540195/volumes" Dec 11 14:19:04 crc kubenswrapper[4898]: I1211 14:19:04.996115 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:19:04 crc kubenswrapper[4898]: I1211 
14:19:04.997277 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:19:34 crc kubenswrapper[4898]: I1211 14:19:34.995649 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:19:34 crc kubenswrapper[4898]: I1211 14:19:34.996080 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:19:40 crc kubenswrapper[4898]: I1211 14:19:40.788134 4898 scope.go:117] "RemoveContainer" containerID="94dd50ea2f7841d6ed041d7db2b5f8375d3984457aaf4c2b5b3f5a49fd3454f9" Dec 11 14:19:40 crc kubenswrapper[4898]: I1211 14:19:40.811395 4898 scope.go:117] "RemoveContainer" containerID="b3eda30b710c547193599c26a69e8cddf4ce22e4c070920201c3338c3ad075c4" Dec 11 14:19:41 crc kubenswrapper[4898]: I1211 14:19:41.210684 4898 scope.go:117] "RemoveContainer" containerID="7fb6600585eea0d82fae1817cc455ff3f6382ae9a62675d45486ca96037318ad" Dec 11 14:19:45 crc kubenswrapper[4898]: E1211 14:19:45.896358 4898 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.18:43294->38.102.83.18:42973: write tcp 38.102.83.18:43294->38.102.83.18:42973: write: broken pipe Dec 11 14:20:04 crc kubenswrapper[4898]: I1211 14:20:04.995794 4898 patch_prober.go:28] interesting 
pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:20:04 crc kubenswrapper[4898]: I1211 14:20:04.996561 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:20:04 crc kubenswrapper[4898]: I1211 14:20:04.996637 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 14:20:04 crc kubenswrapper[4898]: I1211 14:20:04.997672 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 14:20:04 crc kubenswrapper[4898]: I1211 14:20:04.997778 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" gracePeriod=600 Dec 11 14:20:05 crc kubenswrapper[4898]: E1211 14:20:05.122958 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:20:06 crc kubenswrapper[4898]: I1211 14:20:06.116589 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" exitCode=0 Dec 11 14:20:06 crc kubenswrapper[4898]: I1211 14:20:06.116650 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158"} Dec 11 14:20:06 crc kubenswrapper[4898]: I1211 14:20:06.116691 4898 scope.go:117] "RemoveContainer" containerID="fa652197dab8b227fc745df831449c10b94718795d3847ba94b52a0e5e7de671" Dec 11 14:20:06 crc kubenswrapper[4898]: I1211 14:20:06.117581 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:20:06 crc kubenswrapper[4898]: E1211 14:20:06.117947 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:20:18 crc kubenswrapper[4898]: I1211 14:20:18.775323 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:20:18 crc kubenswrapper[4898]: E1211 14:20:18.776240 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:20:30 crc kubenswrapper[4898]: I1211 14:20:30.783314 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:20:30 crc kubenswrapper[4898]: E1211 14:20:30.784594 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:20:42 crc kubenswrapper[4898]: I1211 14:20:42.784870 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:20:42 crc kubenswrapper[4898]: E1211 14:20:42.786267 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.530620 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-65wpc"] Dec 11 14:20:45 crc kubenswrapper[4898]: E1211 14:20:45.531776 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="56128d2c-b6db-46c6-976a-23d10e540195" containerName="extract-utilities" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.531792 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="56128d2c-b6db-46c6-976a-23d10e540195" containerName="extract-utilities" Dec 11 14:20:45 crc kubenswrapper[4898]: E1211 14:20:45.531830 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56128d2c-b6db-46c6-976a-23d10e540195" containerName="extract-content" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.531837 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="56128d2c-b6db-46c6-976a-23d10e540195" containerName="extract-content" Dec 11 14:20:45 crc kubenswrapper[4898]: E1211 14:20:45.531859 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56128d2c-b6db-46c6-976a-23d10e540195" containerName="registry-server" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.531867 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="56128d2c-b6db-46c6-976a-23d10e540195" containerName="registry-server" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.532119 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="56128d2c-b6db-46c6-976a-23d10e540195" containerName="registry-server" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.534061 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.549770 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-65wpc"] Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.644739 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d537893d-df47-49a2-abfc-12f11067377d-catalog-content\") pod \"redhat-marketplace-65wpc\" (UID: \"d537893d-df47-49a2-abfc-12f11067377d\") " pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.644811 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d537893d-df47-49a2-abfc-12f11067377d-utilities\") pod \"redhat-marketplace-65wpc\" (UID: \"d537893d-df47-49a2-abfc-12f11067377d\") " pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.644897 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bttw7\" (UniqueName: \"kubernetes.io/projected/d537893d-df47-49a2-abfc-12f11067377d-kube-api-access-bttw7\") pod \"redhat-marketplace-65wpc\" (UID: \"d537893d-df47-49a2-abfc-12f11067377d\") " pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.746759 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d537893d-df47-49a2-abfc-12f11067377d-utilities\") pod \"redhat-marketplace-65wpc\" (UID: \"d537893d-df47-49a2-abfc-12f11067377d\") " pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.746879 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-bttw7\" (UniqueName: \"kubernetes.io/projected/d537893d-df47-49a2-abfc-12f11067377d-kube-api-access-bttw7\") pod \"redhat-marketplace-65wpc\" (UID: \"d537893d-df47-49a2-abfc-12f11067377d\") " pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.747403 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d537893d-df47-49a2-abfc-12f11067377d-utilities\") pod \"redhat-marketplace-65wpc\" (UID: \"d537893d-df47-49a2-abfc-12f11067377d\") " pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.747434 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d537893d-df47-49a2-abfc-12f11067377d-catalog-content\") pod \"redhat-marketplace-65wpc\" (UID: \"d537893d-df47-49a2-abfc-12f11067377d\") " pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.747796 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d537893d-df47-49a2-abfc-12f11067377d-catalog-content\") pod \"redhat-marketplace-65wpc\" (UID: \"d537893d-df47-49a2-abfc-12f11067377d\") " pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.773662 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bttw7\" (UniqueName: \"kubernetes.io/projected/d537893d-df47-49a2-abfc-12f11067377d-kube-api-access-bttw7\") pod \"redhat-marketplace-65wpc\" (UID: \"d537893d-df47-49a2-abfc-12f11067377d\") " pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:45 crc kubenswrapper[4898]: I1211 14:20:45.870893 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:46 crc kubenswrapper[4898]: I1211 14:20:46.449094 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-65wpc"] Dec 11 14:20:46 crc kubenswrapper[4898]: I1211 14:20:46.644698 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65wpc" event={"ID":"d537893d-df47-49a2-abfc-12f11067377d","Type":"ContainerStarted","Data":"0cc9cd32a30acd6d511f5438f87f94da7fda4fe23b20e6260691a47811395631"} Dec 11 14:20:47 crc kubenswrapper[4898]: I1211 14:20:47.672181 4898 generic.go:334] "Generic (PLEG): container finished" podID="d537893d-df47-49a2-abfc-12f11067377d" containerID="55c49fad696e072084c8cf3f7128c41700a3a0fae86b332256a2888ecc2839fa" exitCode=0 Dec 11 14:20:47 crc kubenswrapper[4898]: I1211 14:20:47.672327 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65wpc" event={"ID":"d537893d-df47-49a2-abfc-12f11067377d","Type":"ContainerDied","Data":"55c49fad696e072084c8cf3f7128c41700a3a0fae86b332256a2888ecc2839fa"} Dec 11 14:20:47 crc kubenswrapper[4898]: I1211 14:20:47.675990 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 14:20:48 crc kubenswrapper[4898]: I1211 14:20:48.685338 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65wpc" event={"ID":"d537893d-df47-49a2-abfc-12f11067377d","Type":"ContainerStarted","Data":"a033f3562dd939fe2880e6e2980931e5632172966a9af639ea522b60fec0cce1"} Dec 11 14:20:49 crc kubenswrapper[4898]: I1211 14:20:49.702554 4898 generic.go:334] "Generic (PLEG): container finished" podID="d537893d-df47-49a2-abfc-12f11067377d" containerID="a033f3562dd939fe2880e6e2980931e5632172966a9af639ea522b60fec0cce1" exitCode=0 Dec 11 14:20:49 crc kubenswrapper[4898]: I1211 14:20:49.703557 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-65wpc" event={"ID":"d537893d-df47-49a2-abfc-12f11067377d","Type":"ContainerDied","Data":"a033f3562dd939fe2880e6e2980931e5632172966a9af639ea522b60fec0cce1"} Dec 11 14:20:50 crc kubenswrapper[4898]: I1211 14:20:50.715744 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65wpc" event={"ID":"d537893d-df47-49a2-abfc-12f11067377d","Type":"ContainerStarted","Data":"6d3223196ac8b0ffcf8617cf068a5df0ced5e6b5c7f0e02ba45ea9693f79d2b2"} Dec 11 14:20:50 crc kubenswrapper[4898]: I1211 14:20:50.747912 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-65wpc" podStartSLOduration=3.027371965 podStartE2EDuration="5.747892999s" podCreationTimestamp="2025-12-11 14:20:45 +0000 UTC" firstStartedPulling="2025-12-11 14:20:47.675692899 +0000 UTC m=+4605.248019336" lastFinishedPulling="2025-12-11 14:20:50.396213933 +0000 UTC m=+4607.968540370" observedRunningTime="2025-12-11 14:20:50.737434546 +0000 UTC m=+4608.309760983" watchObservedRunningTime="2025-12-11 14:20:50.747892999 +0000 UTC m=+4608.320219436" Dec 11 14:20:53 crc kubenswrapper[4898]: I1211 14:20:53.776062 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:20:53 crc kubenswrapper[4898]: E1211 14:20:53.777014 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:20:55 crc kubenswrapper[4898]: I1211 14:20:55.872161 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:55 crc kubenswrapper[4898]: I1211 14:20:55.872481 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:55 crc kubenswrapper[4898]: I1211 14:20:55.942143 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:56 crc kubenswrapper[4898]: I1211 14:20:56.850530 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:56 crc kubenswrapper[4898]: I1211 14:20:56.904161 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-65wpc"] Dec 11 14:20:58 crc kubenswrapper[4898]: I1211 14:20:58.802678 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-65wpc" podUID="d537893d-df47-49a2-abfc-12f11067377d" containerName="registry-server" containerID="cri-o://6d3223196ac8b0ffcf8617cf068a5df0ced5e6b5c7f0e02ba45ea9693f79d2b2" gracePeriod=2 Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.337524 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.384421 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d537893d-df47-49a2-abfc-12f11067377d-utilities\") pod \"d537893d-df47-49a2-abfc-12f11067377d\" (UID: \"d537893d-df47-49a2-abfc-12f11067377d\") " Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.384781 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d537893d-df47-49a2-abfc-12f11067377d-catalog-content\") pod \"d537893d-df47-49a2-abfc-12f11067377d\" (UID: \"d537893d-df47-49a2-abfc-12f11067377d\") " Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.384975 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bttw7\" (UniqueName: \"kubernetes.io/projected/d537893d-df47-49a2-abfc-12f11067377d-kube-api-access-bttw7\") pod \"d537893d-df47-49a2-abfc-12f11067377d\" (UID: \"d537893d-df47-49a2-abfc-12f11067377d\") " Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.385420 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d537893d-df47-49a2-abfc-12f11067377d-utilities" (OuterVolumeSpecName: "utilities") pod "d537893d-df47-49a2-abfc-12f11067377d" (UID: "d537893d-df47-49a2-abfc-12f11067377d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.386133 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d537893d-df47-49a2-abfc-12f11067377d-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.391899 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d537893d-df47-49a2-abfc-12f11067377d-kube-api-access-bttw7" (OuterVolumeSpecName: "kube-api-access-bttw7") pod "d537893d-df47-49a2-abfc-12f11067377d" (UID: "d537893d-df47-49a2-abfc-12f11067377d"). InnerVolumeSpecName "kube-api-access-bttw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.418349 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d537893d-df47-49a2-abfc-12f11067377d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d537893d-df47-49a2-abfc-12f11067377d" (UID: "d537893d-df47-49a2-abfc-12f11067377d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.487511 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d537893d-df47-49a2-abfc-12f11067377d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.487545 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bttw7\" (UniqueName: \"kubernetes.io/projected/d537893d-df47-49a2-abfc-12f11067377d-kube-api-access-bttw7\") on node \"crc\" DevicePath \"\"" Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.892607 4898 generic.go:334] "Generic (PLEG): container finished" podID="d537893d-df47-49a2-abfc-12f11067377d" containerID="6d3223196ac8b0ffcf8617cf068a5df0ced5e6b5c7f0e02ba45ea9693f79d2b2" exitCode=0 Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.892664 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65wpc" event={"ID":"d537893d-df47-49a2-abfc-12f11067377d","Type":"ContainerDied","Data":"6d3223196ac8b0ffcf8617cf068a5df0ced5e6b5c7f0e02ba45ea9693f79d2b2"} Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.892697 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65wpc" event={"ID":"d537893d-df47-49a2-abfc-12f11067377d","Type":"ContainerDied","Data":"0cc9cd32a30acd6d511f5438f87f94da7fda4fe23b20e6260691a47811395631"} Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.892718 4898 scope.go:117] "RemoveContainer" containerID="6d3223196ac8b0ffcf8617cf068a5df0ced5e6b5c7f0e02ba45ea9693f79d2b2" Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.892903 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65wpc" Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.940367 4898 scope.go:117] "RemoveContainer" containerID="a033f3562dd939fe2880e6e2980931e5632172966a9af639ea522b60fec0cce1" Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.962592 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-65wpc"] Dec 11 14:20:59 crc kubenswrapper[4898]: I1211 14:20:59.969841 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-65wpc"] Dec 11 14:21:00 crc kubenswrapper[4898]: I1211 14:21:00.015597 4898 scope.go:117] "RemoveContainer" containerID="55c49fad696e072084c8cf3f7128c41700a3a0fae86b332256a2888ecc2839fa" Dec 11 14:21:00 crc kubenswrapper[4898]: I1211 14:21:00.059803 4898 scope.go:117] "RemoveContainer" containerID="6d3223196ac8b0ffcf8617cf068a5df0ced5e6b5c7f0e02ba45ea9693f79d2b2" Dec 11 14:21:00 crc kubenswrapper[4898]: E1211 14:21:00.060250 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d3223196ac8b0ffcf8617cf068a5df0ced5e6b5c7f0e02ba45ea9693f79d2b2\": container with ID starting with 6d3223196ac8b0ffcf8617cf068a5df0ced5e6b5c7f0e02ba45ea9693f79d2b2 not found: ID does not exist" containerID="6d3223196ac8b0ffcf8617cf068a5df0ced5e6b5c7f0e02ba45ea9693f79d2b2" Dec 11 14:21:00 crc kubenswrapper[4898]: I1211 14:21:00.060301 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d3223196ac8b0ffcf8617cf068a5df0ced5e6b5c7f0e02ba45ea9693f79d2b2"} err="failed to get container status \"6d3223196ac8b0ffcf8617cf068a5df0ced5e6b5c7f0e02ba45ea9693f79d2b2\": rpc error: code = NotFound desc = could not find container \"6d3223196ac8b0ffcf8617cf068a5df0ced5e6b5c7f0e02ba45ea9693f79d2b2\": container with ID starting with 6d3223196ac8b0ffcf8617cf068a5df0ced5e6b5c7f0e02ba45ea9693f79d2b2 not found: 
ID does not exist" Dec 11 14:21:00 crc kubenswrapper[4898]: I1211 14:21:00.060340 4898 scope.go:117] "RemoveContainer" containerID="a033f3562dd939fe2880e6e2980931e5632172966a9af639ea522b60fec0cce1" Dec 11 14:21:00 crc kubenswrapper[4898]: E1211 14:21:00.060654 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a033f3562dd939fe2880e6e2980931e5632172966a9af639ea522b60fec0cce1\": container with ID starting with a033f3562dd939fe2880e6e2980931e5632172966a9af639ea522b60fec0cce1 not found: ID does not exist" containerID="a033f3562dd939fe2880e6e2980931e5632172966a9af639ea522b60fec0cce1" Dec 11 14:21:00 crc kubenswrapper[4898]: I1211 14:21:00.060687 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a033f3562dd939fe2880e6e2980931e5632172966a9af639ea522b60fec0cce1"} err="failed to get container status \"a033f3562dd939fe2880e6e2980931e5632172966a9af639ea522b60fec0cce1\": rpc error: code = NotFound desc = could not find container \"a033f3562dd939fe2880e6e2980931e5632172966a9af639ea522b60fec0cce1\": container with ID starting with a033f3562dd939fe2880e6e2980931e5632172966a9af639ea522b60fec0cce1 not found: ID does not exist" Dec 11 14:21:00 crc kubenswrapper[4898]: I1211 14:21:00.060712 4898 scope.go:117] "RemoveContainer" containerID="55c49fad696e072084c8cf3f7128c41700a3a0fae86b332256a2888ecc2839fa" Dec 11 14:21:00 crc kubenswrapper[4898]: E1211 14:21:00.060883 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55c49fad696e072084c8cf3f7128c41700a3a0fae86b332256a2888ecc2839fa\": container with ID starting with 55c49fad696e072084c8cf3f7128c41700a3a0fae86b332256a2888ecc2839fa not found: ID does not exist" containerID="55c49fad696e072084c8cf3f7128c41700a3a0fae86b332256a2888ecc2839fa" Dec 11 14:21:00 crc kubenswrapper[4898]: I1211 14:21:00.060904 4898 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55c49fad696e072084c8cf3f7128c41700a3a0fae86b332256a2888ecc2839fa"} err="failed to get container status \"55c49fad696e072084c8cf3f7128c41700a3a0fae86b332256a2888ecc2839fa\": rpc error: code = NotFound desc = could not find container \"55c49fad696e072084c8cf3f7128c41700a3a0fae86b332256a2888ecc2839fa\": container with ID starting with 55c49fad696e072084c8cf3f7128c41700a3a0fae86b332256a2888ecc2839fa not found: ID does not exist" Dec 11 14:21:00 crc kubenswrapper[4898]: I1211 14:21:00.791507 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d537893d-df47-49a2-abfc-12f11067377d" path="/var/lib/kubelet/pods/d537893d-df47-49a2-abfc-12f11067377d/volumes" Dec 11 14:21:04 crc kubenswrapper[4898]: I1211 14:21:04.865947 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-prbz7"] Dec 11 14:21:04 crc kubenswrapper[4898]: E1211 14:21:04.866822 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d537893d-df47-49a2-abfc-12f11067377d" containerName="registry-server" Dec 11 14:21:04 crc kubenswrapper[4898]: I1211 14:21:04.866833 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d537893d-df47-49a2-abfc-12f11067377d" containerName="registry-server" Dec 11 14:21:04 crc kubenswrapper[4898]: E1211 14:21:04.866858 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d537893d-df47-49a2-abfc-12f11067377d" containerName="extract-utilities" Dec 11 14:21:04 crc kubenswrapper[4898]: I1211 14:21:04.866864 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d537893d-df47-49a2-abfc-12f11067377d" containerName="extract-utilities" Dec 11 14:21:04 crc kubenswrapper[4898]: E1211 14:21:04.866897 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d537893d-df47-49a2-abfc-12f11067377d" containerName="extract-content" Dec 11 14:21:04 crc kubenswrapper[4898]: I1211 14:21:04.866903 4898 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d537893d-df47-49a2-abfc-12f11067377d" containerName="extract-content" Dec 11 14:21:04 crc kubenswrapper[4898]: I1211 14:21:04.867115 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d537893d-df47-49a2-abfc-12f11067377d" containerName="registry-server" Dec 11 14:21:04 crc kubenswrapper[4898]: I1211 14:21:04.868677 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:04 crc kubenswrapper[4898]: I1211 14:21:04.911495 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-prbz7"] Dec 11 14:21:05 crc kubenswrapper[4898]: I1211 14:21:05.040860 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c4b4454-c28c-46dd-a28f-ae05666ccb71-catalog-content\") pod \"certified-operators-prbz7\" (UID: \"2c4b4454-c28c-46dd-a28f-ae05666ccb71\") " pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:05 crc kubenswrapper[4898]: I1211 14:21:05.041180 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln2bt\" (UniqueName: \"kubernetes.io/projected/2c4b4454-c28c-46dd-a28f-ae05666ccb71-kube-api-access-ln2bt\") pod \"certified-operators-prbz7\" (UID: \"2c4b4454-c28c-46dd-a28f-ae05666ccb71\") " pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:05 crc kubenswrapper[4898]: I1211 14:21:05.041467 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c4b4454-c28c-46dd-a28f-ae05666ccb71-utilities\") pod \"certified-operators-prbz7\" (UID: \"2c4b4454-c28c-46dd-a28f-ae05666ccb71\") " pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:05 crc kubenswrapper[4898]: I1211 
14:21:05.144382 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c4b4454-c28c-46dd-a28f-ae05666ccb71-catalog-content\") pod \"certified-operators-prbz7\" (UID: \"2c4b4454-c28c-46dd-a28f-ae05666ccb71\") " pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:05 crc kubenswrapper[4898]: I1211 14:21:05.144449 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln2bt\" (UniqueName: \"kubernetes.io/projected/2c4b4454-c28c-46dd-a28f-ae05666ccb71-kube-api-access-ln2bt\") pod \"certified-operators-prbz7\" (UID: \"2c4b4454-c28c-46dd-a28f-ae05666ccb71\") " pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:05 crc kubenswrapper[4898]: I1211 14:21:05.144522 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c4b4454-c28c-46dd-a28f-ae05666ccb71-utilities\") pod \"certified-operators-prbz7\" (UID: \"2c4b4454-c28c-46dd-a28f-ae05666ccb71\") " pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:05 crc kubenswrapper[4898]: I1211 14:21:05.145046 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c4b4454-c28c-46dd-a28f-ae05666ccb71-utilities\") pod \"certified-operators-prbz7\" (UID: \"2c4b4454-c28c-46dd-a28f-ae05666ccb71\") " pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:05 crc kubenswrapper[4898]: I1211 14:21:05.145362 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c4b4454-c28c-46dd-a28f-ae05666ccb71-catalog-content\") pod \"certified-operators-prbz7\" (UID: \"2c4b4454-c28c-46dd-a28f-ae05666ccb71\") " pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:05 crc kubenswrapper[4898]: I1211 14:21:05.166871 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln2bt\" (UniqueName: \"kubernetes.io/projected/2c4b4454-c28c-46dd-a28f-ae05666ccb71-kube-api-access-ln2bt\") pod \"certified-operators-prbz7\" (UID: \"2c4b4454-c28c-46dd-a28f-ae05666ccb71\") " pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:05 crc kubenswrapper[4898]: I1211 14:21:05.189691 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:05 crc kubenswrapper[4898]: I1211 14:21:05.737394 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-prbz7"] Dec 11 14:21:06 crc kubenswrapper[4898]: I1211 14:21:06.012289 4898 generic.go:334] "Generic (PLEG): container finished" podID="2c4b4454-c28c-46dd-a28f-ae05666ccb71" containerID="1082ffe2121e4f4c72e8b550c76c60e23279a9e47569262d6149fbdd0e6c5ba6" exitCode=0 Dec 11 14:21:06 crc kubenswrapper[4898]: I1211 14:21:06.012626 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prbz7" event={"ID":"2c4b4454-c28c-46dd-a28f-ae05666ccb71","Type":"ContainerDied","Data":"1082ffe2121e4f4c72e8b550c76c60e23279a9e47569262d6149fbdd0e6c5ba6"} Dec 11 14:21:06 crc kubenswrapper[4898]: I1211 14:21:06.012677 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prbz7" event={"ID":"2c4b4454-c28c-46dd-a28f-ae05666ccb71","Type":"ContainerStarted","Data":"1bc84f1effd89db125919b1a288c4d465848988065a7568881f26b0c3ed0a2ac"} Dec 11 14:21:07 crc kubenswrapper[4898]: I1211 14:21:07.798918 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:21:07 crc kubenswrapper[4898]: E1211 14:21:07.800808 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:21:08 crc kubenswrapper[4898]: I1211 14:21:08.046243 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prbz7" event={"ID":"2c4b4454-c28c-46dd-a28f-ae05666ccb71","Type":"ContainerStarted","Data":"0f9bb67ba9c46268e8a1a9d1850669c5a9938bb7c1ca98f253e423e5e4b197c5"} Dec 11 14:21:09 crc kubenswrapper[4898]: I1211 14:21:09.059279 4898 generic.go:334] "Generic (PLEG): container finished" podID="2c4b4454-c28c-46dd-a28f-ae05666ccb71" containerID="0f9bb67ba9c46268e8a1a9d1850669c5a9938bb7c1ca98f253e423e5e4b197c5" exitCode=0 Dec 11 14:21:09 crc kubenswrapper[4898]: I1211 14:21:09.059331 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prbz7" event={"ID":"2c4b4454-c28c-46dd-a28f-ae05666ccb71","Type":"ContainerDied","Data":"0f9bb67ba9c46268e8a1a9d1850669c5a9938bb7c1ca98f253e423e5e4b197c5"} Dec 11 14:21:10 crc kubenswrapper[4898]: I1211 14:21:10.074126 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prbz7" event={"ID":"2c4b4454-c28c-46dd-a28f-ae05666ccb71","Type":"ContainerStarted","Data":"0834c1ac43e6ff1899ae590adda353454fb0504e5f0d743f9a90e971fa9141f9"} Dec 11 14:21:10 crc kubenswrapper[4898]: I1211 14:21:10.115638 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-prbz7" podStartSLOduration=2.4562813549999998 podStartE2EDuration="6.115613637s" podCreationTimestamp="2025-12-11 14:21:04 +0000 UTC" firstStartedPulling="2025-12-11 14:21:06.01501736 +0000 UTC m=+4623.587343797" lastFinishedPulling="2025-12-11 14:21:09.674349632 +0000 UTC m=+4627.246676079" 
observedRunningTime="2025-12-11 14:21:10.103981372 +0000 UTC m=+4627.676307829" watchObservedRunningTime="2025-12-11 14:21:10.115613637 +0000 UTC m=+4627.687940094" Dec 11 14:21:15 crc kubenswrapper[4898]: I1211 14:21:15.190605 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:15 crc kubenswrapper[4898]: I1211 14:21:15.191805 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:15 crc kubenswrapper[4898]: I1211 14:21:15.239996 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:16 crc kubenswrapper[4898]: I1211 14:21:16.201481 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:16 crc kubenswrapper[4898]: I1211 14:21:16.478471 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-prbz7"] Dec 11 14:21:18 crc kubenswrapper[4898]: I1211 14:21:18.163669 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-prbz7" podUID="2c4b4454-c28c-46dd-a28f-ae05666ccb71" containerName="registry-server" containerID="cri-o://0834c1ac43e6ff1899ae590adda353454fb0504e5f0d743f9a90e971fa9141f9" gracePeriod=2 Dec 11 14:21:18 crc kubenswrapper[4898]: I1211 14:21:18.710580 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:18 crc kubenswrapper[4898]: I1211 14:21:18.805582 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c4b4454-c28c-46dd-a28f-ae05666ccb71-utilities\") pod \"2c4b4454-c28c-46dd-a28f-ae05666ccb71\" (UID: \"2c4b4454-c28c-46dd-a28f-ae05666ccb71\") " Dec 11 14:21:18 crc kubenswrapper[4898]: I1211 14:21:18.805691 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c4b4454-c28c-46dd-a28f-ae05666ccb71-catalog-content\") pod \"2c4b4454-c28c-46dd-a28f-ae05666ccb71\" (UID: \"2c4b4454-c28c-46dd-a28f-ae05666ccb71\") " Dec 11 14:21:18 crc kubenswrapper[4898]: I1211 14:21:18.805768 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln2bt\" (UniqueName: \"kubernetes.io/projected/2c4b4454-c28c-46dd-a28f-ae05666ccb71-kube-api-access-ln2bt\") pod \"2c4b4454-c28c-46dd-a28f-ae05666ccb71\" (UID: \"2c4b4454-c28c-46dd-a28f-ae05666ccb71\") " Dec 11 14:21:18 crc kubenswrapper[4898]: I1211 14:21:18.814445 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c4b4454-c28c-46dd-a28f-ae05666ccb71-utilities" (OuterVolumeSpecName: "utilities") pod "2c4b4454-c28c-46dd-a28f-ae05666ccb71" (UID: "2c4b4454-c28c-46dd-a28f-ae05666ccb71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:21:18 crc kubenswrapper[4898]: I1211 14:21:18.818518 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c4b4454-c28c-46dd-a28f-ae05666ccb71-kube-api-access-ln2bt" (OuterVolumeSpecName: "kube-api-access-ln2bt") pod "2c4b4454-c28c-46dd-a28f-ae05666ccb71" (UID: "2c4b4454-c28c-46dd-a28f-ae05666ccb71"). InnerVolumeSpecName "kube-api-access-ln2bt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:21:18 crc kubenswrapper[4898]: I1211 14:21:18.869546 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c4b4454-c28c-46dd-a28f-ae05666ccb71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c4b4454-c28c-46dd-a28f-ae05666ccb71" (UID: "2c4b4454-c28c-46dd-a28f-ae05666ccb71"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:21:18 crc kubenswrapper[4898]: I1211 14:21:18.913811 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln2bt\" (UniqueName: \"kubernetes.io/projected/2c4b4454-c28c-46dd-a28f-ae05666ccb71-kube-api-access-ln2bt\") on node \"crc\" DevicePath \"\"" Dec 11 14:21:18 crc kubenswrapper[4898]: I1211 14:21:18.913862 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c4b4454-c28c-46dd-a28f-ae05666ccb71-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:21:18 crc kubenswrapper[4898]: I1211 14:21:18.913876 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c4b4454-c28c-46dd-a28f-ae05666ccb71-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:21:19 crc kubenswrapper[4898]: I1211 14:21:19.178819 4898 generic.go:334] "Generic (PLEG): container finished" podID="2c4b4454-c28c-46dd-a28f-ae05666ccb71" containerID="0834c1ac43e6ff1899ae590adda353454fb0504e5f0d743f9a90e971fa9141f9" exitCode=0 Dec 11 14:21:19 crc kubenswrapper[4898]: I1211 14:21:19.178995 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-prbz7" event={"ID":"2c4b4454-c28c-46dd-a28f-ae05666ccb71","Type":"ContainerDied","Data":"0834c1ac43e6ff1899ae590adda353454fb0504e5f0d743f9a90e971fa9141f9"} Dec 11 14:21:19 crc kubenswrapper[4898]: I1211 14:21:19.180281 4898 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-prbz7" event={"ID":"2c4b4454-c28c-46dd-a28f-ae05666ccb71","Type":"ContainerDied","Data":"1bc84f1effd89db125919b1a288c4d465848988065a7568881f26b0c3ed0a2ac"} Dec 11 14:21:19 crc kubenswrapper[4898]: I1211 14:21:19.180392 4898 scope.go:117] "RemoveContainer" containerID="0834c1ac43e6ff1899ae590adda353454fb0504e5f0d743f9a90e971fa9141f9" Dec 11 14:21:19 crc kubenswrapper[4898]: I1211 14:21:19.179137 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-prbz7" Dec 11 14:21:19 crc kubenswrapper[4898]: I1211 14:21:19.224229 4898 scope.go:117] "RemoveContainer" containerID="0f9bb67ba9c46268e8a1a9d1850669c5a9938bb7c1ca98f253e423e5e4b197c5" Dec 11 14:21:19 crc kubenswrapper[4898]: I1211 14:21:19.245592 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-prbz7"] Dec 11 14:21:19 crc kubenswrapper[4898]: I1211 14:21:19.271201 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-prbz7"] Dec 11 14:21:19 crc kubenswrapper[4898]: I1211 14:21:19.288134 4898 scope.go:117] "RemoveContainer" containerID="1082ffe2121e4f4c72e8b550c76c60e23279a9e47569262d6149fbdd0e6c5ba6" Dec 11 14:21:19 crc kubenswrapper[4898]: I1211 14:21:19.342138 4898 scope.go:117] "RemoveContainer" containerID="0834c1ac43e6ff1899ae590adda353454fb0504e5f0d743f9a90e971fa9141f9" Dec 11 14:21:19 crc kubenswrapper[4898]: E1211 14:21:19.349611 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0834c1ac43e6ff1899ae590adda353454fb0504e5f0d743f9a90e971fa9141f9\": container with ID starting with 0834c1ac43e6ff1899ae590adda353454fb0504e5f0d743f9a90e971fa9141f9 not found: ID does not exist" containerID="0834c1ac43e6ff1899ae590adda353454fb0504e5f0d743f9a90e971fa9141f9" Dec 11 14:21:19 crc kubenswrapper[4898]: I1211 
14:21:19.349667 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0834c1ac43e6ff1899ae590adda353454fb0504e5f0d743f9a90e971fa9141f9"} err="failed to get container status \"0834c1ac43e6ff1899ae590adda353454fb0504e5f0d743f9a90e971fa9141f9\": rpc error: code = NotFound desc = could not find container \"0834c1ac43e6ff1899ae590adda353454fb0504e5f0d743f9a90e971fa9141f9\": container with ID starting with 0834c1ac43e6ff1899ae590adda353454fb0504e5f0d743f9a90e971fa9141f9 not found: ID does not exist" Dec 11 14:21:19 crc kubenswrapper[4898]: I1211 14:21:19.349700 4898 scope.go:117] "RemoveContainer" containerID="0f9bb67ba9c46268e8a1a9d1850669c5a9938bb7c1ca98f253e423e5e4b197c5" Dec 11 14:21:19 crc kubenswrapper[4898]: E1211 14:21:19.357000 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f9bb67ba9c46268e8a1a9d1850669c5a9938bb7c1ca98f253e423e5e4b197c5\": container with ID starting with 0f9bb67ba9c46268e8a1a9d1850669c5a9938bb7c1ca98f253e423e5e4b197c5 not found: ID does not exist" containerID="0f9bb67ba9c46268e8a1a9d1850669c5a9938bb7c1ca98f253e423e5e4b197c5" Dec 11 14:21:19 crc kubenswrapper[4898]: I1211 14:21:19.357056 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9bb67ba9c46268e8a1a9d1850669c5a9938bb7c1ca98f253e423e5e4b197c5"} err="failed to get container status \"0f9bb67ba9c46268e8a1a9d1850669c5a9938bb7c1ca98f253e423e5e4b197c5\": rpc error: code = NotFound desc = could not find container \"0f9bb67ba9c46268e8a1a9d1850669c5a9938bb7c1ca98f253e423e5e4b197c5\": container with ID starting with 0f9bb67ba9c46268e8a1a9d1850669c5a9938bb7c1ca98f253e423e5e4b197c5 not found: ID does not exist" Dec 11 14:21:19 crc kubenswrapper[4898]: I1211 14:21:19.357091 4898 scope.go:117] "RemoveContainer" containerID="1082ffe2121e4f4c72e8b550c76c60e23279a9e47569262d6149fbdd0e6c5ba6" Dec 11 14:21:19 crc 
kubenswrapper[4898]: E1211 14:21:19.359966 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1082ffe2121e4f4c72e8b550c76c60e23279a9e47569262d6149fbdd0e6c5ba6\": container with ID starting with 1082ffe2121e4f4c72e8b550c76c60e23279a9e47569262d6149fbdd0e6c5ba6 not found: ID does not exist" containerID="1082ffe2121e4f4c72e8b550c76c60e23279a9e47569262d6149fbdd0e6c5ba6" Dec 11 14:21:19 crc kubenswrapper[4898]: I1211 14:21:19.359997 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1082ffe2121e4f4c72e8b550c76c60e23279a9e47569262d6149fbdd0e6c5ba6"} err="failed to get container status \"1082ffe2121e4f4c72e8b550c76c60e23279a9e47569262d6149fbdd0e6c5ba6\": rpc error: code = NotFound desc = could not find container \"1082ffe2121e4f4c72e8b550c76c60e23279a9e47569262d6149fbdd0e6c5ba6\": container with ID starting with 1082ffe2121e4f4c72e8b550c76c60e23279a9e47569262d6149fbdd0e6c5ba6 not found: ID does not exist" Dec 11 14:21:20 crc kubenswrapper[4898]: I1211 14:21:20.775787 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:21:20 crc kubenswrapper[4898]: E1211 14:21:20.776362 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:21:20 crc kubenswrapper[4898]: I1211 14:21:20.791304 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c4b4454-c28c-46dd-a28f-ae05666ccb71" path="/var/lib/kubelet/pods/2c4b4454-c28c-46dd-a28f-ae05666ccb71/volumes" Dec 11 14:21:35 crc 
kubenswrapper[4898]: I1211 14:21:35.775219 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:21:35 crc kubenswrapper[4898]: E1211 14:21:35.775966 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:21:47 crc kubenswrapper[4898]: I1211 14:21:47.775362 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:21:47 crc kubenswrapper[4898]: E1211 14:21:47.776194 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:21:58 crc kubenswrapper[4898]: I1211 14:21:58.775226 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:21:58 crc kubenswrapper[4898]: E1211 14:21:58.776305 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 
11 14:22:10 crc kubenswrapper[4898]: I1211 14:22:10.775405 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:22:10 crc kubenswrapper[4898]: E1211 14:22:10.776587 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:22:25 crc kubenswrapper[4898]: I1211 14:22:25.775686 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:22:25 crc kubenswrapper[4898]: E1211 14:22:25.776567 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:22:37 crc kubenswrapper[4898]: I1211 14:22:37.781145 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:22:37 crc kubenswrapper[4898]: E1211 14:22:37.782367 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" 
podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:22:48 crc kubenswrapper[4898]: I1211 14:22:48.776097 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:22:48 crc kubenswrapper[4898]: E1211 14:22:48.776967 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:22:59 crc kubenswrapper[4898]: I1211 14:22:59.775715 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:22:59 crc kubenswrapper[4898]: E1211 14:22:59.776576 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:23:11 crc kubenswrapper[4898]: I1211 14:23:11.776005 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:23:11 crc kubenswrapper[4898]: E1211 14:23:11.777962 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:23:22 crc kubenswrapper[4898]: I1211 14:23:22.786963 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:23:22 crc kubenswrapper[4898]: E1211 14:23:22.787953 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:23:37 crc kubenswrapper[4898]: I1211 14:23:37.776036 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:23:37 crc kubenswrapper[4898]: E1211 14:23:37.777389 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:23:51 crc kubenswrapper[4898]: I1211 14:23:51.775615 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:23:51 crc kubenswrapper[4898]: E1211 14:23:51.776684 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:24:06 crc kubenswrapper[4898]: I1211 14:24:06.775999 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:24:06 crc kubenswrapper[4898]: E1211 14:24:06.776888 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.248559 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 11 14:24:15 crc kubenswrapper[4898]: E1211 14:24:15.250110 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4b4454-c28c-46dd-a28f-ae05666ccb71" containerName="registry-server" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.250148 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4b4454-c28c-46dd-a28f-ae05666ccb71" containerName="registry-server" Dec 11 14:24:15 crc kubenswrapper[4898]: E1211 14:24:15.250250 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4b4454-c28c-46dd-a28f-ae05666ccb71" containerName="extract-content" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.250270 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4b4454-c28c-46dd-a28f-ae05666ccb71" containerName="extract-content" Dec 11 14:24:15 crc kubenswrapper[4898]: E1211 14:24:15.250314 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2c4b4454-c28c-46dd-a28f-ae05666ccb71" containerName="extract-utilities" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.250329 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4b4454-c28c-46dd-a28f-ae05666ccb71" containerName="extract-utilities" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.250972 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4b4454-c28c-46dd-a28f-ae05666ccb71" containerName="registry-server" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.252587 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.255141 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.255218 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.255825 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.256550 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-wn9nm" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.274179 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.290920 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.291118 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a1c00711-2048-486e-b3f7-3d5441032df8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.291263 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.291331 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a1c00711-2048-486e-b3f7-3d5441032df8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.291400 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.291478 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a1c00711-2048-486e-b3f7-3d5441032df8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 
14:24:15.291511 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1c00711-2048-486e-b3f7-3d5441032df8-config-data\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.291611 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.291697 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6xgj\" (UniqueName: \"kubernetes.io/projected/a1c00711-2048-486e-b3f7-3d5441032df8-kube-api-access-t6xgj\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.394139 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6xgj\" (UniqueName: \"kubernetes.io/projected/a1c00711-2048-486e-b3f7-3d5441032df8-kube-api-access-t6xgj\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.394533 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.394746 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a1c00711-2048-486e-b3f7-3d5441032df8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.394955 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.395088 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a1c00711-2048-486e-b3f7-3d5441032df8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.395217 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.395361 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a1c00711-2048-486e-b3f7-3d5441032df8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.395515 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/a1c00711-2048-486e-b3f7-3d5441032df8-config-data\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.395613 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a1c00711-2048-486e-b3f7-3d5441032df8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.395786 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.395928 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a1c00711-2048-486e-b3f7-3d5441032df8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.396368 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a1c00711-2048-486e-b3f7-3d5441032df8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.397672 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1c00711-2048-486e-b3f7-3d5441032df8-config-data\") pod \"tempest-tests-tempest\" 
(UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.398322 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.401115 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.401997 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.404946 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.416092 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6xgj\" (UniqueName: \"kubernetes.io/projected/a1c00711-2048-486e-b3f7-3d5441032df8-kube-api-access-t6xgj\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 
14:24:15.442816 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " pod="openstack/tempest-tests-tempest" Dec 11 14:24:15 crc kubenswrapper[4898]: I1211 14:24:15.582876 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 11 14:24:16 crc kubenswrapper[4898]: I1211 14:24:16.239608 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 11 14:24:16 crc kubenswrapper[4898]: I1211 14:24:16.354617 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a1c00711-2048-486e-b3f7-3d5441032df8","Type":"ContainerStarted","Data":"9350096e9801513c7502031d4fa8739d2c915e503f1c7a4c8fd07d48bf747dfe"} Dec 11 14:24:20 crc kubenswrapper[4898]: I1211 14:24:20.775699 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:24:20 crc kubenswrapper[4898]: E1211 14:24:20.776598 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:24:31 crc kubenswrapper[4898]: I1211 14:24:31.775738 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:24:31 crc kubenswrapper[4898]: E1211 14:24:31.776763 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:24:42 crc kubenswrapper[4898]: I1211 14:24:42.787340 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:24:42 crc kubenswrapper[4898]: E1211 14:24:42.788228 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:24:53 crc kubenswrapper[4898]: I1211 14:24:53.776268 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:24:53 crc kubenswrapper[4898]: E1211 14:24:53.777088 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:24:58 crc kubenswrapper[4898]: E1211 14:24:58.080475 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 11 14:24:58 crc kubenswrapper[4898]: E1211 14:24:58.082119 4898 kuberuntime_manager.go:1274] 
"Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t6xgj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},L
ivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(a1c00711-2048-486e-b3f7-3d5441032df8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 14:24:58 crc kubenswrapper[4898]: E1211 14:24:58.083616 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="a1c00711-2048-486e-b3f7-3d5441032df8" Dec 11 14:24:58 crc kubenswrapper[4898]: E1211 14:24:58.889573 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" 
podUID="a1c00711-2048-486e-b3f7-3d5441032df8" Dec 11 14:25:07 crc kubenswrapper[4898]: I1211 14:25:07.776295 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:25:09 crc kubenswrapper[4898]: I1211 14:25:09.006312 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"da62fe95b38866dac78489cf9559a274d6c76f5f176513a8e912deec87459d47"} Dec 11 14:25:11 crc kubenswrapper[4898]: I1211 14:25:11.584834 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 11 14:25:14 crc kubenswrapper[4898]: I1211 14:25:14.092926 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a1c00711-2048-486e-b3f7-3d5441032df8","Type":"ContainerStarted","Data":"36b9059582a127334c7da49faa3c0ab449f5011fcc0db21ade710e2d72fddbb7"} Dec 11 14:25:14 crc kubenswrapper[4898]: I1211 14:25:14.125874 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.781377922 podStartE2EDuration="1m0.125839235s" podCreationTimestamp="2025-12-11 14:24:14 +0000 UTC" firstStartedPulling="2025-12-11 14:24:16.235759621 +0000 UTC m=+4813.808086058" lastFinishedPulling="2025-12-11 14:25:11.580220934 +0000 UTC m=+4869.152547371" observedRunningTime="2025-12-11 14:25:14.115929248 +0000 UTC m=+4871.688255695" watchObservedRunningTime="2025-12-11 14:25:14.125839235 +0000 UTC m=+4871.698165662" Dec 11 14:26:09 crc kubenswrapper[4898]: I1211 14:26:09.451645 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9bw58"] Dec 11 14:26:09 crc kubenswrapper[4898]: I1211 14:26:09.458127 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:26:09 crc kubenswrapper[4898]: I1211 14:26:09.543783 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs6lt\" (UniqueName: \"kubernetes.io/projected/2b267048-1a44-4a75-bd64-24641a6b5a63-kube-api-access-gs6lt\") pod \"redhat-operators-9bw58\" (UID: \"2b267048-1a44-4a75-bd64-24641a6b5a63\") " pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:26:09 crc kubenswrapper[4898]: I1211 14:26:09.543900 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b267048-1a44-4a75-bd64-24641a6b5a63-catalog-content\") pod \"redhat-operators-9bw58\" (UID: \"2b267048-1a44-4a75-bd64-24641a6b5a63\") " pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:26:09 crc kubenswrapper[4898]: I1211 14:26:09.544036 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b267048-1a44-4a75-bd64-24641a6b5a63-utilities\") pod \"redhat-operators-9bw58\" (UID: \"2b267048-1a44-4a75-bd64-24641a6b5a63\") " pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:26:09 crc kubenswrapper[4898]: I1211 14:26:09.562169 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9bw58"] Dec 11 14:26:09 crc kubenswrapper[4898]: I1211 14:26:09.650315 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs6lt\" (UniqueName: \"kubernetes.io/projected/2b267048-1a44-4a75-bd64-24641a6b5a63-kube-api-access-gs6lt\") pod \"redhat-operators-9bw58\" (UID: \"2b267048-1a44-4a75-bd64-24641a6b5a63\") " pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:26:09 crc kubenswrapper[4898]: I1211 14:26:09.650436 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b267048-1a44-4a75-bd64-24641a6b5a63-catalog-content\") pod \"redhat-operators-9bw58\" (UID: \"2b267048-1a44-4a75-bd64-24641a6b5a63\") " pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:26:09 crc kubenswrapper[4898]: I1211 14:26:09.650605 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b267048-1a44-4a75-bd64-24641a6b5a63-utilities\") pod \"redhat-operators-9bw58\" (UID: \"2b267048-1a44-4a75-bd64-24641a6b5a63\") " pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:26:09 crc kubenswrapper[4898]: I1211 14:26:09.653896 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b267048-1a44-4a75-bd64-24641a6b5a63-utilities\") pod \"redhat-operators-9bw58\" (UID: \"2b267048-1a44-4a75-bd64-24641a6b5a63\") " pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:26:09 crc kubenswrapper[4898]: I1211 14:26:09.656902 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b267048-1a44-4a75-bd64-24641a6b5a63-catalog-content\") pod \"redhat-operators-9bw58\" (UID: \"2b267048-1a44-4a75-bd64-24641a6b5a63\") " pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:26:09 crc kubenswrapper[4898]: I1211 14:26:09.683833 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs6lt\" (UniqueName: \"kubernetes.io/projected/2b267048-1a44-4a75-bd64-24641a6b5a63-kube-api-access-gs6lt\") pod \"redhat-operators-9bw58\" (UID: \"2b267048-1a44-4a75-bd64-24641a6b5a63\") " pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:26:09 crc kubenswrapper[4898]: I1211 14:26:09.783888 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:26:11 crc kubenswrapper[4898]: I1211 14:26:11.478050 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9bw58"] Dec 11 14:26:11 crc kubenswrapper[4898]: I1211 14:26:11.799213 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bw58" event={"ID":"2b267048-1a44-4a75-bd64-24641a6b5a63","Type":"ContainerStarted","Data":"03b65a6d965c0c58c5fb3e39eca279c8d8bb61774498fb612bac05726e7a6102"} Dec 11 14:26:12 crc kubenswrapper[4898]: I1211 14:26:12.828775 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bw58" event={"ID":"2b267048-1a44-4a75-bd64-24641a6b5a63","Type":"ContainerDied","Data":"4a553195cc0dc577d426468f06a9cc33b11263e816533e5169604bf3a360ed8e"} Dec 11 14:26:12 crc kubenswrapper[4898]: I1211 14:26:12.829771 4898 generic.go:334] "Generic (PLEG): container finished" podID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerID="4a553195cc0dc577d426468f06a9cc33b11263e816533e5169604bf3a360ed8e" exitCode=0 Dec 11 14:26:12 crc kubenswrapper[4898]: I1211 14:26:12.833026 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 14:26:14 crc kubenswrapper[4898]: I1211 14:26:14.853290 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bw58" event={"ID":"2b267048-1a44-4a75-bd64-24641a6b5a63","Type":"ContainerStarted","Data":"9f49690e934231b33c37d5ca5f51e06294074d8a4fe199e0da14708bca1565b2"} Dec 11 14:26:22 crc kubenswrapper[4898]: I1211 14:26:22.507664 4898 patch_prober.go:28] interesting pod/router-default-5444994796-jmljz container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:22 crc 
kubenswrapper[4898]: I1211 14:26:22.537528 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-jmljz" podUID="7b9b7fbd-6a91-4761-8604-b82c417e12f8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:22 crc kubenswrapper[4898]: I1211 14:26:22.519326 4898 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-56mlq container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:22 crc kubenswrapper[4898]: I1211 14:26:22.541337 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" podUID="eca70e56-1feb-4a48-9c32-db8e075ebfff" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:22 crc kubenswrapper[4898]: I1211 14:26:22.523420 4898 patch_prober.go:28] interesting pod/router-default-5444994796-jmljz container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:22 crc kubenswrapper[4898]: I1211 14:26:22.544697 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-jmljz" podUID="7b9b7fbd-6a91-4761-8604-b82c417e12f8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:24 crc 
kubenswrapper[4898]: I1211 14:26:24.611270 4898 generic.go:334] "Generic (PLEG): container finished" podID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerID="9f49690e934231b33c37d5ca5f51e06294074d8a4fe199e0da14708bca1565b2" exitCode=0 Dec 11 14:26:24 crc kubenswrapper[4898]: I1211 14:26:24.611375 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bw58" event={"ID":"2b267048-1a44-4a75-bd64-24641a6b5a63","Type":"ContainerDied","Data":"9f49690e934231b33c37d5ca5f51e06294074d8a4fe199e0da14708bca1565b2"} Dec 11 14:26:26 crc kubenswrapper[4898]: I1211 14:26:26.634353 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bw58" event={"ID":"2b267048-1a44-4a75-bd64-24641a6b5a63","Type":"ContainerStarted","Data":"17cae00930c797a1735d0d7a2c1dce3066f4554f07ef4ccc04fa9f4ec1578dfc"} Dec 11 14:26:26 crc kubenswrapper[4898]: I1211 14:26:26.653239 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9bw58" podStartSLOduration=5.306867821 podStartE2EDuration="17.65275418s" podCreationTimestamp="2025-12-11 14:26:09 +0000 UTC" firstStartedPulling="2025-12-11 14:26:12.832253276 +0000 UTC m=+4930.404579713" lastFinishedPulling="2025-12-11 14:26:25.178139635 +0000 UTC m=+4942.750466072" observedRunningTime="2025-12-11 14:26:26.650259013 +0000 UTC m=+4944.222585460" watchObservedRunningTime="2025-12-11 14:26:26.65275418 +0000 UTC m=+4944.225080617" Dec 11 14:26:29 crc kubenswrapper[4898]: I1211 14:26:29.832355 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:26:29 crc kubenswrapper[4898]: I1211 14:26:29.832677 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:26:31 crc kubenswrapper[4898]: I1211 14:26:31.283649 4898 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-9bw58" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" probeResult="failure" output=< Dec 11 14:26:31 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:26:31 crc kubenswrapper[4898]: > Dec 11 14:26:41 crc kubenswrapper[4898]: I1211 14:26:41.776184 4898 trace.go:236] Trace[277569743]: "Calculate volume metrics of ovnkube-config for pod openshift-ovn-kubernetes/ovnkube-node-lwjvl" (11-Dec-2025 14:26:40.240) (total time: 1530ms): Dec 11 14:26:41 crc kubenswrapper[4898]: Trace[277569743]: [1.53073003s] [1.53073003s] END Dec 11 14:26:41 crc kubenswrapper[4898]: I1211 14:26:41.782453 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.377517314s: [/var/lib/containers/storage/overlay/408fc88068024451e865589adcf4dbe1b17ea9fadcddf05d26321fb79f2b3bc4/diff /var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-pqc9z_ac70ed50-7e53-4bb9-ac63-35e5c0651db5/manager/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:26:41 crc kubenswrapper[4898]: I1211 14:26:41.782569 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.187585535s: [/var/lib/containers/storage/overlay/eec7d82040e51ed4f5606a2b9b561357b43135b47ef0bf0fd3c89e7c4375bcb6/diff /var/log/pods/openstack_neutron-84b4b98fdc-tbjdg_72ca8f16-912b-44f0-bc9d-868f381fb8fb/neutron-api/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:26:41 crc kubenswrapper[4898]: I1211 14:26:41.783821 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.789662112s: [/var/lib/containers/storage/overlay/78a10a3afa610de416f6d0caf81bff294dfd6327e9cfa928d87ba543d5532636/diff /var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-pgfnv_09d9c781-008c-4486-807c-159f4fefe857/manager/0.log]; will not log again for this container unless duration exceeds 2s 
Dec 11 14:26:41 crc kubenswrapper[4898]: I1211 14:26:41.784113 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.379125607s: [/var/lib/containers/storage/overlay/186f059e84a576e0f2d5fcb01510038463db928335c2d612ed3f6fc64d2fdff2/diff /var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-wlqm4_e87a760e-bf60-4a98-bb37-1f44745e250f/manager/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:26:41 crc kubenswrapper[4898]: I1211 14:26:41.782610 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.365384556s: [/var/lib/containers/storage/overlay/559a891cf07cd5b5b234e2bcb2df5d66489a5685e2f3af13b497e0ffd2b1413d/diff /var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-7kffw_5c391a19-7c2d-4838-9269-2c5cd8eea1ad/manager/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:26:41 crc kubenswrapper[4898]: I1211 14:26:41.785715 4898 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-cnkcn container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:41 crc kubenswrapper[4898]: I1211 14:26:41.786309 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" podUID="2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:41 crc kubenswrapper[4898]: I1211 14:26:41.792137 4898 patch_prober.go:28] interesting pod/perses-operator-5446b9c989-nq968 container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.11:8081/readyz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:41 crc kubenswrapper[4898]: I1211 14:26:41.792198 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5446b9c989-nq968" podUID="c5f15058-ca6b-40a5-bad2-83ea7339d28b" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.11:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:41 crc kubenswrapper[4898]: I1211 14:26:41.806286 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.865578451s: [/var/lib/containers/storage/overlay/de576e76b89e082b3427961813ba7086047f49f6de4c55475615d0df1f04d785/diff /var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-wxz25_a99a2194-b89b-4a6a-a086-acd20b489632/manager/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:26:41 crc kubenswrapper[4898]: I1211 14:26:41.827640 4898 trace.go:236] Trace[1573761317]: "Calculate volume metrics of trusted-ca for pod openshift-console-operator/console-operator-58897d9998-rkglj" (11-Dec-2025 14:26:40.044) (total time: 1783ms): Dec 11 14:26:41 crc kubenswrapper[4898]: Trace[1573761317]: [1.783498795s] [1.783498795s] END Dec 11 14:26:41 crc kubenswrapper[4898]: E1211 14:26:41.895056 4898 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="3.121s" Dec 11 14:26:42 crc kubenswrapper[4898]: I1211 14:26:42.846024 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-4bmq4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:42 crc kubenswrapper[4898]: I1211 14:26:42.846015 4898 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4lcmf" podUID="7a0813f7-7167-46ed-b9f8-e2157e92f620" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:42 crc kubenswrapper[4898]: I1211 14:26:42.846028 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-4bmq4 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:42 crc kubenswrapper[4898]: I1211 14:26:42.846429 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4bmq4" podUID="5442029d-e1aa-4496-b3c6-18f11b034179" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:42 crc kubenswrapper[4898]: I1211 14:26:42.846373 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4bmq4" podUID="5442029d-e1aa-4496-b3c6-18f11b034179" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:42 crc kubenswrapper[4898]: I1211 14:26:42.855382 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="2e7cffb6-80f8-45e8-a4ab-219dc834a613" containerName="galera" probeResult="failure" output="command timed out" Dec 11 14:26:42 crc kubenswrapper[4898]: I1211 14:26:42.855423 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="5ae31191-f9f6-452a-8f45-a48b4736012e" containerName="galera" probeResult="failure" output="command timed out" Dec 11 14:26:42 crc 
kubenswrapper[4898]: I1211 14:26:42.855466 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="2e7cffb6-80f8-45e8-a4ab-219dc834a613" containerName="galera" probeResult="failure" output="command timed out" Dec 11 14:26:42 crc kubenswrapper[4898]: I1211 14:26:42.855389 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="69fcbdad-bb34-4a36-9100-352ddce7c906" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 11 14:26:42 crc kubenswrapper[4898]: I1211 14:26:42.855387 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="5ae31191-f9f6-452a-8f45-a48b4736012e" containerName="galera" probeResult="failure" output="command timed out" Dec 11 14:26:42 crc kubenswrapper[4898]: I1211 14:26:42.857516 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="69fcbdad-bb34-4a36-9100-352ddce7c906" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 11 14:26:43 crc kubenswrapper[4898]: I1211 14:26:43.435854 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pjgq4 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:43 crc kubenswrapper[4898]: I1211 14:26:43.436213 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" podUID="14360874-1ac7-4262-b7ed-3ccc4d909191" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:43 crc kubenswrapper[4898]: I1211 14:26:43.435939 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pjgq4 
container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:43 crc kubenswrapper[4898]: I1211 14:26:43.436571 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" podUID="14360874-1ac7-4262-b7ed-3ccc4d909191" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:43 crc kubenswrapper[4898]: I1211 14:26:43.787079 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="d71ee22f-68e7-43d7-8a6a-012ff8b8104e" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Dec 11 14:26:44 crc kubenswrapper[4898]: I1211 14:26:44.497751 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" podUID="351e2ec9-301e-4fd9-b8ef-45494b9a1291" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:44 crc kubenswrapper[4898]: I1211 14:26:44.497805 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" podUID="351e2ec9-301e-4fd9-b8ef-45494b9a1291" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:44 crc kubenswrapper[4898]: I1211 14:26:44.788293 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9bw58" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" 
containerName="registry-server" probeResult="failure" output="command timed out" Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.195986 4898 patch_prober.go:28] interesting pod/route-controller-manager-6fcffdd775-jsnhs container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.196062 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" podUID="2dcb95b5-448f-4cc1-8399-9c4c1cc5046f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.231743 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-trbn2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.232394 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" podUID="bb6a1bca-9d01-4e70-882f-47a6e90923df" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.57:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.231993 4898 patch_prober.go:28] interesting pod/controller-manager-6d6f47df77-zcqxc container/controller-manager namespace/openshift-controller-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.232485 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" podUID="0fd6bf11-4912-4c55-b89d-5866167a0283" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.272695 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-trbn2 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.272754 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" podUID="bb6a1bca-9d01-4e70-882f-47a6e90923df" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.57:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.272713 4898 patch_prober.go:28] interesting pod/controller-manager-6d6f47df77-zcqxc container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.272821 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" podUID="0fd6bf11-4912-4c55-b89d-5866167a0283" 
containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.272752 4898 patch_prober.go:28] interesting pod/route-controller-manager-6fcffdd775-jsnhs container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.272863 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" podUID="2dcb95b5-448f-4cc1-8399-9c4c1cc5046f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.397838 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-6gtqk" podUID="d898c00f-7c50-483d-84f3-9c502696b39a" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.397831 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-6gtqk" podUID="d898c00f-7c50-483d-84f3-9c502696b39a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.397920 4898 patch_prober.go:28] interesting pod/console-9bc76884c-z28hg container/console namespace/openshift-console: Readiness probe status=failure output="Get 
\"https://10.217.0.137:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.397939 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-9bc76884c-z28hg" podUID="4ba352c0-f542-46ac-abcc-c136ddfb67fc" containerName="console" probeResult="failure" output="Get \"https://10.217.0.137:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.397924 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-6gtqk" podUID="d898c00f-7c50-483d-84f3-9c502696b39a" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.457654 4898 patch_prober.go:28] interesting pod/logging-loki-distributor-76cc67bf56-qjz7m container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.457738 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" podUID="841a3e5b-876d-43b2-b24a-d5c01876c30d" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.561068 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-5bddd4b946-bkr8v" 
podUID="cfd0fc01-1bde-4b11-bbdd-d95693d0dd15" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.561612 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-5bddd4b946-bkr8v" podUID="cfd0fc01-1bde-4b11-bbdd-d95693d0dd15" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.599900 4898 patch_prober.go:28] interesting pod/logging-loki-querier-5895d59bb8-62tqd container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.599959 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" podUID="4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:45 crc kubenswrapper[4898]: I1211 14:26:45.609927 4898 trace.go:236] Trace[1360641957]: "Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-jhl8f" (11-Dec-2025 14:26:41.785) (total time: 3824ms): Dec 11 14:26:45 crc kubenswrapper[4898]: Trace[1360641957]: [3.824371338s] [3.824371338s] END Dec 11 14:26:46 crc kubenswrapper[4898]: I1211 14:26:46.196767 4898 patch_prober.go:28] interesting pod/logging-loki-query-frontend-84558f7c9f-rjbbq container/loki-query-frontend 
namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:46 crc kubenswrapper[4898]: I1211 14:26:46.197163 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" podUID="98efd2cb-8cd8-49c2-a54b-5a04cf51dc71" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:46 crc kubenswrapper[4898]: I1211 14:26:46.285781 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" podUID="c0569df8-06fa-4d31-a59a-904b90e4a0ca" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.97:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:46 crc kubenswrapper[4898]: I1211 14:26:46.285781 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" podUID="c0569df8-06fa-4d31-a59a-904b90e4a0ca" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.97:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:47 crc kubenswrapper[4898]: I1211 14:26:46.741672 4898 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.75:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:47 crc kubenswrapper[4898]: I1211 14:26:46.741915 4898 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="3cd3cd1d-9ead-4620-a346-f83e9e5190ba" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.75:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:47 crc kubenswrapper[4898]: I1211 14:26:46.741672 4898 patch_prober.go:28] interesting pod/loki-operator-controller-manager-7d9d9f99f6-7sstc container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:47 crc kubenswrapper[4898]: I1211 14:26:46.741968 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" podUID="b4cf67d3-b13e-4afb-be20-80dc0801c69c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:47 crc kubenswrapper[4898]: I1211 14:26:46.800764 4898 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.80:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:47 crc kubenswrapper[4898]: I1211 14:26:46.800842 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="8571f1ff-bc62-45f4-a34d-d221e36df569" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.80:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:47 crc 
kubenswrapper[4898]: I1211 14:26:46.893864 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.62:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:26:47 crc kubenswrapper[4898]: I1211 14:26:46.893919 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="f6ad4db8-64f3-403c-9c92-9033a73ed12c" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.62:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:47 crc kubenswrapper[4898]: I1211 14:26:46.995686 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-7jgw9" podUID="80d1af81-ad34-4f94-afd2-94c3773ea9ea" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:47 crc kubenswrapper[4898]: I1211 14:26:47.201977 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-7jgw9" podUID="80d1af81-ad34-4f94-afd2-94c3773ea9ea" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:26:48 crc kubenswrapper[4898]: I1211 14:26:48.790184 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="d71ee22f-68e7-43d7-8a6a-012ff8b8104e" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Dec 11 14:26:50 crc kubenswrapper[4898]: I1211 14:26:50.841366 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9bw58" 
podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" probeResult="failure" output=< Dec 11 14:26:50 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:26:50 crc kubenswrapper[4898]: > Dec 11 14:27:01 crc kubenswrapper[4898]: I1211 14:27:01.054319 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9bw58" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:01 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:27:01 crc kubenswrapper[4898]: > Dec 11 14:27:03 crc kubenswrapper[4898]: I1211 14:27:03.103487 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-wlqm4" podUID="e87a760e-bf60-4a98-bb37-1f44745e250f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:05 crc kubenswrapper[4898]: I1211 14:27:05.303822 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-6gtqk" podUID="d898c00f-7c50-483d-84f3-9c502696b39a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:05 crc kubenswrapper[4898]: I1211 14:27:05.497102 4898 patch_prober.go:28] interesting pod/console-9bc76884c-z28hg container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.137:8443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:27:05 crc kubenswrapper[4898]: I1211 14:27:05.498773 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-9bc76884c-z28hg" 
podUID="4ba352c0-f542-46ac-abcc-c136ddfb67fc" containerName="console" probeResult="failure" output="Get \"https://10.217.0.137:8443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:10 crc kubenswrapper[4898]: I1211 14:27:10.851576 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9bw58" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:10 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:27:10 crc kubenswrapper[4898]: > Dec 11 14:27:16 crc kubenswrapper[4898]: I1211 14:27:16.014640 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-operator-5dc777b99d-mszqj" podUID="4e56a824-5c00-4a67-a8c3-a32a001f0ce4" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.102:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:20 crc kubenswrapper[4898]: I1211 14:27:20.925887 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9bw58" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:20 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:27:20 crc kubenswrapper[4898]: > Dec 11 14:27:23 crc kubenswrapper[4898]: I1211 14:27:23.528648 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-2wt95" podUID="cb6adf46-208a-4945-97aa-2c457b9c2614" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:23 crc kubenswrapper[4898]: I1211 14:27:23.528678 4898 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-766b45bcdb-ksffb" podUID="f52c9389-ea61-4327-afd9-f4c92541a821" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:23.587720 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5854674fcc-8d8zb" podUID="1a7e7363-7657-4eb2-a969-9f4c08a50984" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:23.629622 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-9kw6z" podUID="7d6dbccc-94de-44f9-b7d2-5bbcfee1d119" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:27.583613 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.216:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.409276 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.798224016s: [/var/lib/containers/storage/overlay/7120918b1daceff96128a291e998e6f85969d21ac1e85fbb9b4ef587e9af7902/diff /var/log/pods/openstack_heat-api-58ffc484cf-pk2vt_d83af347-3774-4e08-8138-6e67557da826/heat-api/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:27:29 crc kubenswrapper[4898]: 
I1211 14:27:29.412499 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 3.301079933s: [/var/lib/containers/storage/overlay/2a4d7eed15482543841f00a88d4c223852879962b8ebd820856255341f737366/diff /var/log/pods/openshift-marketplace_redhat-operators-9bw58_2b267048-1a44-4a75-bd64-24641a6b5a63/registry-server/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.424725 4898 trace.go:236] Trace[170027695]: "Calculate volume metrics of metrics-client-ca for pod openshift-monitoring/alertmanager-main-0" (11-Dec-2025 14:27:28.419) (total time: 1005ms): Dec 11 14:27:29 crc kubenswrapper[4898]: Trace[170027695]: [1.005601282s] [1.005601282s] END Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.425262 4898 trace.go:236] Trace[421168359]: "Calculate volume metrics of logging-loki-ca-bundle for pod openshift-logging/logging-loki-gateway-69ffd5987-jj95c" (11-Dec-2025 14:27:24.009) (total time: 5415ms): Dec 11 14:27:29 crc kubenswrapper[4898]: Trace[421168359]: [5.415500132s] [5.415500132s] END Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.425360 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 5.431647353s: [/var/lib/containers/storage/overlay/10834d8a4894305788c2aa433d47030e1a8b2054a79152df09a9e0e252fa5312/diff /var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-check-endpoints/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.425899 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 5.820966841s: [/var/lib/containers/storage/overlay/c0bbb7d64fa559de635d4641b63181b7436d6ea3160d318d43bfa16eec18db61/diff /var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-cert-regeneration-controller/0.log]; will not log again for this 
container unless duration exceeds 2s Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.426300 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 5.912856724s: [/var/lib/containers/storage/overlay/2c8316a098aacccb5c8df1c977e68928358331ab43c9ee8181d8e645a59fd183/diff /var/log/pods/openshift-dns_node-resolver-h86cf_00c0a5c3-6be3-4c77-a628-cf6710a1f10f/dns-node-resolver/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.430219 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 4.584825537s: [/var/lib/containers/storage/overlay/084c60eb9ea4afb864b8ef688fed809fdc42abf9e31121eed1ce75b53787240e/diff /var/log/pods/openstack_heat-engine-686c7f94b-jlnr7_836f22d0-0883-463e-942b-abb6931a997f/heat-engine/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.430737 4898 trace.go:236] Trace[174090783]: "Calculate volume metrics of scripts for pod openstack/ovn-controller-lsxj7" (11-Dec-2025 14:27:24.409) (total time: 5020ms): Dec 11 14:27:29 crc kubenswrapper[4898]: Trace[174090783]: [5.02076968s] [5.02076968s] END Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.435658 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 6.28489468s: [/var/lib/containers/storage/overlay/c6b9b840090024256b39731c46420d28af9f04259d402a0191cbcabd99c59229/diff /var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.435775 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 4.182963795s: [/var/lib/containers/storage/overlay/57bc7212ad89c2f7aa4fda54012bc3104bb2eaa244a409a8d44e4babb6c644d4/diff 
/var/log/pods/openstack_ceilometer-0_d71ee22f-68e7-43d7-8a6a-012ff8b8104e/ceilometer-central-agent/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.436897 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 6.482175765s: [/var/lib/containers/storage/overlay/7d3cfdcc46135841f3dcff6eec2c898f8803aef04b1a88343695b465c1e80f99/diff /var/log/pods/openstack_swift-proxy-6867fd7bcf-bbj7b_4ed17564-edd0-4a66-8b9b-04aabd280113/proxy-server/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.438206 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.839104506s: [/var/lib/containers/storage/overlay/913f138ea1a3a5176551d9414a18e7fee66d014190aaf7d6924c10ab7a2e360d/diff /var/log/pods/openstack_heat-cfnapi-68bbb97c49-zk2lz_911f5e17-1a51-4bf3-8f1c-cdedc2f4404c/heat-cfnapi/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.443800 4898 trace.go:236] Trace[773580160]: "Calculate volume metrics of marketplace-trusted-ca for pod openshift-marketplace/marketplace-operator-79b997595-trbn2" (11-Dec-2025 14:27:24.447) (total time: 4967ms): Dec 11 14:27:29 crc kubenswrapper[4898]: Trace[773580160]: [4.967782836s] [4.967782836s] END Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.443816 4898 trace.go:236] Trace[946043487]: "Calculate volume metrics of config for pod openshift-etcd-operator/etcd-operator-b45778765-42442" (11-Dec-2025 14:27:27.300) (total time: 2114ms): Dec 11 14:27:29 crc kubenswrapper[4898]: Trace[946043487]: [2.114589893s] [2.114589893s] END Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.443828 4898 trace.go:236] Trace[2122475645]: "Calculate volume metrics of config for pod openstack/ovsdbserver-sb-0" (11-Dec-2025 14:27:24.601) (total time: 4813ms): Dec 11 14:27:29 crc 
kubenswrapper[4898]: Trace[2122475645]: [4.813930441s] [4.813930441s] END Dec 11 14:27:29 crc kubenswrapper[4898]: I1211 14:27:29.494499 4898 trace.go:236] Trace[1946733704]: "Calculate volume metrics of metrics-client-ca for pod openshift-monitoring/prometheus-operator-db54df47d-q452r" (11-Dec-2025 14:27:22.709) (total time: 6784ms): Dec 11 14:27:29 crc kubenswrapper[4898]: Trace[1946733704]: [6.784941454s] [6.784941454s] END Dec 11 14:27:29 crc kubenswrapper[4898]: E1211 14:27:29.642129 4898 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="6.865s" Dec 11 14:27:30 crc kubenswrapper[4898]: I1211 14:27:30.578769 4898 patch_prober.go:28] interesting pod/controller-manager-6d6f47df77-zcqxc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:27:30 crc kubenswrapper[4898]: I1211 14:27:30.579689 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" podUID="0fd6bf11-4912-4c55-b89d-5866167a0283" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:30 crc kubenswrapper[4898]: I1211 14:27:30.580229 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-6gtqk" podUID="d898c00f-7c50-483d-84f3-9c502696b39a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:30 crc kubenswrapper[4898]: I1211 14:27:30.580310 4898 patch_prober.go:28] interesting pod/monitoring-plugin-5ffffb4f84-84dnp 
container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.77:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:27:30 crc kubenswrapper[4898]: I1211 14:27:30.580381 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp" podUID="1111559b-96c1-4918-b502-1b5045b8a9da" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.77:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:30 crc kubenswrapper[4898]: I1211 14:27:30.600420 4898 patch_prober.go:28] interesting pod/nmstate-webhook-f8fb84555-z6l22 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:27:30 crc kubenswrapper[4898]: I1211 14:27:30.600511 4898 patch_prober.go:28] interesting pod/controller-manager-6d6f47df77-zcqxc container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:27:30 crc kubenswrapper[4898]: I1211 14:27:30.600568 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" podUID="0fd6bf11-4912-4c55-b89d-5866167a0283" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:30 crc kubenswrapper[4898]: I1211 14:27:30.602657 4898 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" podUID="79dd8f49-7447-49a9-84a3-252ac5286cc3" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.90:9443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:30 crc kubenswrapper[4898]: I1211 14:27:30.785202 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="2e7cffb6-80f8-45e8-a4ab-219dc834a613" containerName="galera" probeResult="failure" output="command timed out" Dec 11 14:27:30 crc kubenswrapper[4898]: I1211 14:27:30.785368 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="2e7cffb6-80f8-45e8-a4ab-219dc834a613" containerName="galera" probeResult="failure" output="command timed out" Dec 11 14:27:30 crc kubenswrapper[4898]: I1211 14:27:30.785511 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="69fcbdad-bb34-4a36-9100-352ddce7c906" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 11 14:27:30 crc kubenswrapper[4898]: I1211 14:27:30.785746 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="69fcbdad-bb34-4a36-9100-352ddce7c906" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 11 14:27:32 crc kubenswrapper[4898]: I1211 14:27:32.781805 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="69fcbdad-bb34-4a36-9100-352ddce7c906" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 11 14:27:32 crc kubenswrapper[4898]: I1211 14:27:32.782193 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="69fcbdad-bb34-4a36-9100-352ddce7c906" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 11 14:27:34 crc kubenswrapper[4898]: I1211 14:27:32.782530 4898 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="5ae31191-f9f6-452a-8f45-a48b4736012e" containerName="galera" probeResult="failure" output="command timed out" Dec 11 14:27:34 crc kubenswrapper[4898]: I1211 14:27:32.782939 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="5ae31191-f9f6-452a-8f45-a48b4736012e" containerName="galera" probeResult="failure" output="command timed out" Dec 11 14:27:34 crc kubenswrapper[4898]: I1211 14:27:33.294078 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-jqfm2" podUID="f1d2e209-8101-4341-b232-ed52d1d9f629" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:34 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:27:34 crc kubenswrapper[4898]: > Dec 11 14:27:34 crc kubenswrapper[4898]: I1211 14:27:33.294258 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-kxmhp" podUID="c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:34 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:27:34 crc kubenswrapper[4898]: > Dec 11 14:27:34 crc kubenswrapper[4898]: I1211 14:27:33.503789 4898 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-zl9tf container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:27:34 crc kubenswrapper[4898]: I1211 14:27:33.503819 4898 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-zl9tf container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get 
\"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:27:34 crc kubenswrapper[4898]: I1211 14:27:33.503878 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" podUID="d5b90074-739f-4f6d-a41c-29612abca57e" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:34 crc kubenswrapper[4898]: I1211 14:27:33.503909 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zl9tf" podUID="d5b90074-739f-4f6d-a41c-29612abca57e" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:34 crc kubenswrapper[4898]: E1211 14:27:34.071044 4898 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.287s" Dec 11 14:27:34 crc kubenswrapper[4898]: I1211 14:27:34.072440 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-nkc2q" podUID="b434beff-f5d1-4b10-8715-10cdfe445919" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:34 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:27:34 crc kubenswrapper[4898]: > Dec 11 14:27:34 crc kubenswrapper[4898]: I1211 14:27:34.076991 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-nkc2q" podUID="b434beff-f5d1-4b10-8715-10cdfe445919" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:34 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:27:34 
crc kubenswrapper[4898]: > Dec 11 14:27:34 crc kubenswrapper[4898]: I1211 14:27:34.078450 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-kxmhp" podUID="c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:34 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:27:34 crc kubenswrapper[4898]: > Dec 11 14:27:34 crc kubenswrapper[4898]: I1211 14:27:34.083646 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-jncnt" podUID="e837aefb-2b43-47e8-87b0-232560ff1b37" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:34 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:27:34 crc kubenswrapper[4898]: > Dec 11 14:27:34 crc kubenswrapper[4898]: I1211 14:27:34.084426 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-jncnt" podUID="e837aefb-2b43-47e8-87b0-232560ff1b37" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:34 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:27:34 crc kubenswrapper[4898]: > Dec 11 14:27:34 crc kubenswrapper[4898]: I1211 14:27:34.088578 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-jqfm2" podUID="f1d2e209-8101-4341-b232-ed52d1d9f629" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:34 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:27:34 crc kubenswrapper[4898]: > Dec 11 14:27:34 crc kubenswrapper[4898]: I1211 14:27:34.088912 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9bw58" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:34 
crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:27:34 crc kubenswrapper[4898]: > Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.044729 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.049147 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.141023 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" podUID="351e2ec9-301e-4fd9-b8ef-45494b9a1291" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.141131 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" podUID="351e2ec9-301e-4fd9-b8ef-45494b9a1291" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.182325 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-jncnt" podUID="e837aefb-2b43-47e8-87b0-232560ff1b37" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:35 crc 
kubenswrapper[4898]: timeout: health rpc did not complete within 1s Dec 11 14:27:35 crc kubenswrapper[4898]: > Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.190024 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-jncnt" podUID="e837aefb-2b43-47e8-87b0-232560ff1b37" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:35 crc kubenswrapper[4898]: timeout: health rpc did not complete within 1s Dec 11 14:27:35 crc kubenswrapper[4898]: > Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.191164 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-nkc2q" podUID="b434beff-f5d1-4b10-8715-10cdfe445919" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:35 crc kubenswrapper[4898]: timeout: health rpc did not complete within 1s Dec 11 14:27:35 crc kubenswrapper[4898]: > Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.192046 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-nkc2q" podUID="b434beff-f5d1-4b10-8715-10cdfe445919" containerName="registry-server" probeResult="failure" output=< Dec 11 14:27:35 crc kubenswrapper[4898]: timeout: health rpc did not complete within 1s Dec 11 14:27:35 crc kubenswrapper[4898]: > Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.302691 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-6gtqk" podUID="d898c00f-7c50-483d-84f3-9c502696b39a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.302695 4898 patch_prober.go:28] interesting pod/console-9bc76884c-z28hg container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.137:8443/health\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.302806 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-9bc76884c-z28hg" podUID="4ba352c0-f542-46ac-abcc-c136ddfb67fc" containerName="console" probeResult="failure" output="Get \"https://10.217.0.137:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.561970 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-5bddd4b946-bkr8v" podUID="cfd0fc01-1bde-4b11-bbdd-d95693d0dd15" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.562320 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-5bddd4b946-bkr8v" podUID="cfd0fc01-1bde-4b11-bbdd-d95693d0dd15" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.787114 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="d71ee22f-68e7-43d7-8a6a-012ff8b8104e" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.787886 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-mxsdw" podUID="2911f97f-4469-4335-b6be-48a0e3c6fda8" containerName="registry-server" probeResult="failure" output="command timed out"
Dec 11 14:27:35 crc kubenswrapper[4898]: I1211 14:27:35.787985 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-mxsdw" podUID="2911f97f-4469-4335-b6be-48a0e3c6fda8" containerName="registry-server" probeResult="failure" output="command timed out"
Dec 11 14:27:36 crc kubenswrapper[4898]: I1211 14:27:36.476738 4898 patch_prober.go:28] interesting pod/monitoring-plugin-5ffffb4f84-84dnp container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.77:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:36 crc kubenswrapper[4898]: I1211 14:27:36.477036 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp" podUID="1111559b-96c1-4918-b502-1b5045b8a9da" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.77:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:37 crc kubenswrapper[4898]: I1211 14:27:36.782614 4898 patch_prober.go:28] interesting pod/loki-operator-controller-manager-7d9d9f99f6-7sstc container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:37 crc kubenswrapper[4898]: I1211 14:27:36.782656 4898 patch_prober.go:28] interesting pod/loki-operator-controller-manager-7d9d9f99f6-7sstc container/manager namespace/openshift-operators-redhat: Liveness probe status=failure output="Get \"http://10.217.0.49:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:37 crc kubenswrapper[4898]: I1211 14:27:36.782674 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" podUID="b4cf67d3-b13e-4afb-be20-80dc0801c69c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:37 crc kubenswrapper[4898]: I1211 14:27:36.782733 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" podUID="b4cf67d3-b13e-4afb-be20-80dc0801c69c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:37 crc kubenswrapper[4898]: I1211 14:27:36.995682 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-7jgw9" podUID="80d1af81-ad34-4f94-afd2-94c3773ea9ea" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:37 crc kubenswrapper[4898]: I1211 14:27:36.995756 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-7jgw9" podUID="80d1af81-ad34-4f94-afd2-94c3773ea9ea" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:37 crc kubenswrapper[4898]: I1211 14:27:37.121128 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-mxsdw" podUID="2911f97f-4469-4335-b6be-48a0e3c6fda8" containerName="registry-server" probeResult="failure" output=<
Dec 11 14:27:37 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Dec 11 14:27:37 crc kubenswrapper[4898]: >
Dec 11 14:27:37 crc kubenswrapper[4898]: I1211 14:27:37.123568 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-mxsdw" podUID="2911f97f-4469-4335-b6be-48a0e3c6fda8" containerName="registry-server" probeResult="failure" output=<
Dec 11 14:27:37 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Dec 11 14:27:37 crc kubenswrapper[4898]: >
Dec 11 14:27:37 crc kubenswrapper[4898]: I1211 14:27:37.278772 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-jqfm2" podUID="f1d2e209-8101-4341-b232-ed52d1d9f629" containerName="registry-server" probeResult="failure" output=<
Dec 11 14:27:37 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Dec 11 14:27:37 crc kubenswrapper[4898]: >
Dec 11 14:27:37 crc kubenswrapper[4898]: I1211 14:27:37.278765 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-kxmhp" podUID="c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a" containerName="registry-server" probeResult="failure" output=<
Dec 11 14:27:37 crc kubenswrapper[4898]: timeout: health rpc did not complete within 1s
Dec 11 14:27:37 crc kubenswrapper[4898]: >
Dec 11 14:27:37 crc kubenswrapper[4898]: I1211 14:27:37.279450 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-kxmhp" podUID="c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a" containerName="registry-server" probeResult="failure" output=<
Dec 11 14:27:37 crc kubenswrapper[4898]: timeout: health rpc did not complete within 1s
Dec 11 14:27:37 crc kubenswrapper[4898]: >
Dec 11 14:27:37 crc kubenswrapper[4898]: I1211 14:27:37.279551 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-jqfm2" podUID="f1d2e209-8101-4341-b232-ed52d1d9f629" containerName="registry-server" probeResult="failure" output=<
Dec 11 14:27:37 crc kubenswrapper[4898]: timeout: health rpc did not complete within 1s
Dec 11 14:27:37 crc kubenswrapper[4898]: >
Dec 11 14:27:37 crc kubenswrapper[4898]: I1211 14:27:37.314914 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="512ddc04-04b7-409c-a856-4f9bf3b22c50" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.203:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:38 crc kubenswrapper[4898]: I1211 14:27:38.796290 4898 patch_prober.go:28] interesting pod/oauth-openshift-6cb668d466-7t64n container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:38 crc kubenswrapper[4898]: I1211 14:27:38.796566 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" podUID="5531fe73-f1a4-4a40-9458-536d6a8e1865" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:38 crc kubenswrapper[4898]: I1211 14:27:38.796786 4898 patch_prober.go:28] interesting pod/oauth-openshift-6cb668d466-7t64n container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:38 crc kubenswrapper[4898]: I1211 14:27:38.796809 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" podUID="5531fe73-f1a4-4a40-9458-536d6a8e1865" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:39 crc kubenswrapper[4898]: I1211 14:27:39.016885 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" podUID="1fbc642b-9636-47c2-a3db-7913fa4a6b91" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:40 crc kubenswrapper[4898]: I1211 14:27:40.706883 4898 patch_prober.go:28] interesting pod/perses-operator-5446b9c989-nq968 container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.11:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:40 crc kubenswrapper[4898]: I1211 14:27:40.707027 4898 patch_prober.go:28] interesting pod/perses-operator-5446b9c989-nq968 container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.11:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:40 crc kubenswrapper[4898]: I1211 14:27:40.707275 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5446b9c989-nq968" podUID="c5f15058-ca6b-40a5-bad2-83ea7339d28b" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.11:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:40 crc kubenswrapper[4898]: I1211 14:27:40.707321 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5446b9c989-nq968" podUID="c5f15058-ca6b-40a5-bad2-83ea7339d28b" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.11:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:41 crc kubenswrapper[4898]: I1211 14:27:40.788577 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ovs-fxk76" podUID="a6777806-e5a2-4585-bd5a-8ba7f7757c59" containerName="ovs-vswitchd" probeResult="failure" output="command timed out"
Dec 11 14:27:41 crc kubenswrapper[4898]: I1211 14:27:40.788606 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ovs-fxk76" podUID="a6777806-e5a2-4585-bd5a-8ba7f7757c59" containerName="ovsdb-server" probeResult="failure" output="command timed out"
Dec 11 14:27:41 crc kubenswrapper[4898]: I1211 14:27:41.490677 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-69ffd5987-wz9b5 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:41 crc kubenswrapper[4898]: I1211 14:27:41.490785 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" podUID="6d6f1657-b9dd-4fba-a216-ca660a4fa958" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:41 crc kubenswrapper[4898]: I1211 14:27:41.508781 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-69ffd5987-jj95c container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.61:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:41 crc kubenswrapper[4898]: I1211 14:27:41.508927 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" podUID="c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.61:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:41 crc kubenswrapper[4898]: I1211 14:27:41.550696 4898 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-cnkcn container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:41 crc kubenswrapper[4898]: I1211 14:27:41.550798 4898 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-cnkcn container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:41 crc kubenswrapper[4898]: I1211 14:27:41.551263 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" podUID="2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:41 crc kubenswrapper[4898]: I1211 14:27:41.551363 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-d8bb48f5d-cnkcn" podUID="2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:42 crc kubenswrapper[4898]: I1211 14:27:41.786026 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="d71ee22f-68e7-43d7-8a6a-012ff8b8104e" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Dec 11 14:27:42 crc kubenswrapper[4898]: I1211 14:27:42.061714 4898 trace.go:236] Trace[1309664322]: "Calculate volume metrics of ovnkube-identity-cm for pod openshift-network-node-identity/network-node-identity-vrzqb" (11-Dec-2025 14:27:39.817) (total time: 1710ms):
Dec 11 14:27:42 crc kubenswrapper[4898]: Trace[1309664322]: [1.710570514s] [1.710570514s] END
Dec 11 14:27:42 crc kubenswrapper[4898]: I1211 14:27:42.174852 4898 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-56mlq container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:42 crc kubenswrapper[4898]: I1211 14:27:42.174918 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" podUID="eca70e56-1feb-4a48-9c32-db8e075ebfff" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:42 crc kubenswrapper[4898]: I1211 14:27:42.175000 4898 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-56mlq container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:42 crc kubenswrapper[4898]: I1211 14:27:42.175015 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" podUID="eca70e56-1feb-4a48-9c32-db8e075ebfff" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:42 crc kubenswrapper[4898]: I1211 14:27:42.355780 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="512ddc04-04b7-409c-a856-4f9bf3b22c50" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.203:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:42 crc kubenswrapper[4898]: I1211 14:27:42.498710 4898 patch_prober.go:28] interesting pod/router-default-5444994796-jmljz container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:42 crc kubenswrapper[4898]: I1211 14:27:42.498776 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-jmljz" podUID="7b9b7fbd-6a91-4761-8604-b82c417e12f8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:42 crc kubenswrapper[4898]: I1211 14:27:42.499043 4898 patch_prober.go:28] interesting pod/router-default-5444994796-jmljz container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:42 crc kubenswrapper[4898]: I1211 14:27:42.499124 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-jmljz" podUID="7b9b7fbd-6a91-4761-8604-b82c417e12f8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:42 crc kubenswrapper[4898]: I1211 14:27:42.732638 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc" podUID="d7d8e047-7525-4d88-b802-550590e7f743" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:43 crc kubenswrapper[4898]: I1211 14:27:43.047730 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-tq8mw" podUID="0c6054e7-bb0a-4cbd-b459-d9d100182fa1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:43 crc kubenswrapper[4898]: I1211 14:27:43.114701 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-967d97867-qrfjf" podUID="1f2676f6-97b8-425e-9d05-9ec2c52055de" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:43 crc kubenswrapper[4898]: I1211 14:27:43.131706 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-qpznh" podUID="1e77353c-6728-4dfa-814c-1a92115c8bf2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:43 crc kubenswrapper[4898]: I1211 14:27:43.186694 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25" podUID="a99a2194-b89b-4a6a-a086-acd20b489632" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:43 crc kubenswrapper[4898]: I1211 14:27:43.783037 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="69fcbdad-bb34-4a36-9100-352ddce7c906" containerName="ovn-northd" probeResult="failure" output="command timed out"
Dec 11 14:27:43 crc kubenswrapper[4898]: I1211 14:27:43.783188 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="5ae31191-f9f6-452a-8f45-a48b4736012e" containerName="galera" probeResult="failure" output="command timed out"
Dec 11 14:27:43 crc kubenswrapper[4898]: I1211 14:27:43.783298 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="69fcbdad-bb34-4a36-9100-352ddce7c906" containerName="ovn-northd" probeResult="failure" output="command timed out"
Dec 11 14:27:43 crc kubenswrapper[4898]: I1211 14:27:43.783367 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="2e7cffb6-80f8-45e8-a4ab-219dc834a613" containerName="galera" probeResult="failure" output="command timed out"
Dec 11 14:27:43 crc kubenswrapper[4898]: I1211 14:27:43.784698 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="5ae31191-f9f6-452a-8f45-a48b4736012e" containerName="galera" probeResult="failure" output="command timed out"
Dec 11 14:27:43 crc kubenswrapper[4898]: I1211 14:27:43.786082 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="2e7cffb6-80f8-45e8-a4ab-219dc834a613" containerName="galera" probeResult="failure" output="command timed out"
Dec 11 14:27:45 crc kubenswrapper[4898]: I1211 14:27:45.167827 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9bw58" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" probeResult="failure" output=<
Dec 11 14:27:45 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Dec 11 14:27:45 crc kubenswrapper[4898]: >
Dec 11 14:27:45 crc kubenswrapper[4898]: I1211 14:27:45.303738 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-6gtqk" podUID="d898c00f-7c50-483d-84f3-9c502696b39a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:45 crc kubenswrapper[4898]: I1211 14:27:45.304738 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-6gtqk"
Dec 11 14:27:45 crc kubenswrapper[4898]: I1211 14:27:45.306830 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr" containerStatusID={"Type":"cri-o","ID":"327b3e02274fa424bc38f29f3663a5fac0c739d9c3423621c944ccbcb6810c8e"} pod="metallb-system/frr-k8s-6gtqk" containerMessage="Container frr failed liveness probe, will be restarted"
Dec 11 14:27:45 crc kubenswrapper[4898]: I1211 14:27:45.307252 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-6gtqk" podUID="d898c00f-7c50-483d-84f3-9c502696b39a" containerName="frr" containerID="cri-o://327b3e02274fa424bc38f29f3663a5fac0c739d9c3423621c944ccbcb6810c8e" gracePeriod=2
Dec 11 14:27:45 crc kubenswrapper[4898]: I1211 14:27:45.456987 4898 patch_prober.go:28] interesting pod/logging-loki-distributor-76cc67bf56-qjz7m container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:45 crc kubenswrapper[4898]: I1211 14:27:45.457058 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" podUID="841a3e5b-876d-43b2-b24a-d5c01876c30d" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:45 crc kubenswrapper[4898]: I1211 14:27:45.523707 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" podUID="c6d5540b-2eb6-411c-b1a9-b0db78e67ae7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:45 crc kubenswrapper[4898]: I1211 14:27:45.523825 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" podUID="c6d5540b-2eb6-411c-b1a9-b0db78e67ae7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:45 crc kubenswrapper[4898]: I1211 14:27:45.599965 4898 patch_prober.go:28] interesting pod/logging-loki-querier-5895d59bb8-62tqd container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:45 crc kubenswrapper[4898]: I1211 14:27:45.600026 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" podUID="4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:45 crc kubenswrapper[4898]: I1211 14:27:45.715697 4898 patch_prober.go:28] interesting pod/logging-loki-query-frontend-84558f7c9f-rjbbq container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:45 crc kubenswrapper[4898]: I1211 14:27:45.715800 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" podUID="98efd2cb-8cd8-49c2-a54b-5a04cf51dc71" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:46 crc kubenswrapper[4898]: I1211 14:27:46.211013 4898 patch_prober.go:28] interesting pod/metrics-server-7c75cc77ff-zj4jw container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.76:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:46 crc kubenswrapper[4898]: I1211 14:27:46.212576 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" podUID="faa81158-1b24-4a0a-8fb6-f362177c51fd" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.76:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:46 crc kubenswrapper[4898]: I1211 14:27:46.250012 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6gtqk" event={"ID":"d898c00f-7c50-483d-84f3-9c502696b39a","Type":"ContainerDied","Data":"327b3e02274fa424bc38f29f3663a5fac0c739d9c3423621c944ccbcb6810c8e"}
Dec 11 14:27:46 crc kubenswrapper[4898]: I1211 14:27:46.250035 4898 generic.go:334] "Generic (PLEG): container finished" podID="d898c00f-7c50-483d-84f3-9c502696b39a" containerID="327b3e02274fa424bc38f29f3663a5fac0c739d9c3423621c944ccbcb6810c8e" exitCode=143
Dec 11 14:27:46 crc kubenswrapper[4898]: I1211 14:27:46.477023 4898 patch_prober.go:28] interesting pod/monitoring-plugin-5ffffb4f84-84dnp container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.77:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:46 crc kubenswrapper[4898]: I1211 14:27:46.477699 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp" podUID="1111559b-96c1-4918-b502-1b5045b8a9da" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.77:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:46 crc kubenswrapper[4898]: I1211 14:27:46.477900 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp"
Dec 11 14:27:46 crc kubenswrapper[4898]: I1211 14:27:46.489145 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-69ffd5987-wz9b5 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:46 crc kubenswrapper[4898]: I1211 14:27:46.489199 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" podUID="6d6f1657-b9dd-4fba-a216-ca660a4fa958" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:46 crc kubenswrapper[4898]: I1211 14:27:46.493311 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp"
Dec 11 14:27:46 crc kubenswrapper[4898]: I1211 14:27:46.509035 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-69ffd5987-jj95c container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.61:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:46 crc kubenswrapper[4898]: I1211 14:27:46.509091 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" podUID="c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.61:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:46 crc kubenswrapper[4898]: I1211 14:27:46.916863 4898 trace.go:236] Trace[198571103]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-ingester-0" (11-Dec-2025 14:27:44.656) (total time: 2259ms):
Dec 11 14:27:46 crc kubenswrapper[4898]: Trace[198571103]: [2.259699546s] [2.259699546s] END
Dec 11 14:27:47 crc kubenswrapper[4898]: I1211 14:27:47.396665 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="512ddc04-04b7-409c-a856-4f9bf3b22c50" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.203:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:47 crc kubenswrapper[4898]: I1211 14:27:47.396756 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 11 14:27:47 crc kubenswrapper[4898]: I1211 14:27:47.398523 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"56cf577ae4d3d1c32613200fed078bc4a9ef27f100edd8ae075c7e367b21e77e"} pod="openstack/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted"
Dec 11 14:27:47 crc kubenswrapper[4898]: I1211 14:27:47.398589 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="512ddc04-04b7-409c-a856-4f9bf3b22c50" containerName="cinder-scheduler" containerID="cri-o://56cf577ae4d3d1c32613200fed078bc4a9ef27f100edd8ae075c7e367b21e77e" gracePeriod=30
Dec 11 14:27:48 crc kubenswrapper[4898]: I1211 14:27:48.374813 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="d71ee22f-68e7-43d7-8a6a-012ff8b8104e" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Dec 11 14:27:48 crc kubenswrapper[4898]: I1211 14:27:48.375570 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0"
Dec 11 14:27:48 crc kubenswrapper[4898]: I1211 14:27:48.383213 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"d9a08fbb5b93e0892e7764b24571d0aa7e4a85ae6a652d601607d681dfc4b9ff"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted"
Dec 11 14:27:48 crc kubenswrapper[4898]: I1211 14:27:48.383339 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d71ee22f-68e7-43d7-8a6a-012ff8b8104e" containerName="ceilometer-central-agent" containerID="cri-o://d9a08fbb5b93e0892e7764b24571d0aa7e4a85ae6a652d601607d681dfc4b9ff" gracePeriod=30
Dec 11 14:27:48 crc kubenswrapper[4898]: I1211 14:27:48.474889 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6gtqk" event={"ID":"d898c00f-7c50-483d-84f3-9c502696b39a","Type":"ContainerStarted","Data":"1650648e2fd0bd0654e4a7c9b8d8f07dc3efc8cc303122fcb8db06e04cfa56f8"}
Dec 11 14:27:48 crc kubenswrapper[4898]: I1211 14:27:48.536609 4898 patch_prober.go:28] interesting pod/nmstate-webhook-f8fb84555-z6l22 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:48 crc kubenswrapper[4898]: I1211 14:27:48.536680 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-f8fb84555-z6l22" podUID="79dd8f49-7447-49a9-84a3-252ac5286cc3" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.90:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:48 crc kubenswrapper[4898]: I1211 14:27:48.786417 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="d71ee22f-68e7-43d7-8a6a-012ff8b8104e" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out"
Dec 11 14:27:48 crc kubenswrapper[4898]: I1211 14:27:48.799131 4898 patch_prober.go:28] interesting pod/oauth-openshift-6cb668d466-7t64n container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:48 crc kubenswrapper[4898]: I1211 14:27:48.799176 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" podUID="5531fe73-f1a4-4a40-9458-536d6a8e1865" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:48 crc kubenswrapper[4898]: I1211 14:27:48.799253 4898 patch_prober.go:28] interesting pod/oauth-openshift-6cb668d466-7t64n container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:48 crc kubenswrapper[4898]: I1211 14:27:48.799302 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" podUID="5531fe73-f1a4-4a40-9458-536d6a8e1865" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:50 crc kubenswrapper[4898]: I1211 14:27:50.781846 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="2e7cffb6-80f8-45e8-a4ab-219dc834a613" containerName="galera" probeResult="failure" output="command timed out"
Dec 11 14:27:50 crc kubenswrapper[4898]: I1211 14:27:50.781907 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="2e7cffb6-80f8-45e8-a4ab-219dc834a613" containerName="galera" probeResult="failure" output="command timed out"
Dec 11 14:27:50 crc kubenswrapper[4898]: I1211 14:27:50.786589 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Dec 11 14:27:50 crc kubenswrapper[4898]: I1211 14:27:50.786623 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-galera-0"
Dec 11 14:27:50 crc kubenswrapper[4898]: I1211 14:27:50.787840 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"7ab646376dff1e213ff8bbf79f0d74175b5dbe8db34e0f510cbcf47d38898a79"} pod="openstack/openstack-galera-0" containerMessage="Container galera failed liveness probe, will be restarted"
Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:51.576499 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9bw58" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" probeResult="failure" output=<
Dec 11 14:27:52 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Dec 11 14:27:52 crc kubenswrapper[4898]: >
Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:51.783232 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="2e7cffb6-80f8-45e8-a4ab-219dc834a613" containerName="galera" probeResult="failure" output="command timed out"
Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.139054 4898 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-pjp8z container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.139138 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-pjp8z" podUID="af9f94b9-3e85-4e0c-bb97-ac84071e969f" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.173826 4898 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-56mlq container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.173884 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" podUID="eca70e56-1feb-4a48-9c32-db8e075ebfff"
containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.173993 4898 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-56mlq container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.174019 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-56mlq" podUID="eca70e56-1feb-4a48-9c32-db8e075ebfff" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.415337 4898 patch_prober.go:28] interesting pod/router-default-5444994796-jmljz container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.415396 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-jmljz" podUID="7b9b7fbd-6a91-4761-8604-b82c417e12f8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.415576 4898 patch_prober.go:28] 
interesting pod/router-default-5444994796-jmljz container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.415601 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-jmljz" podUID="7b9b7fbd-6a91-4761-8604-b82c417e12f8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.771713 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7" podUID="c154e39f-1760-4071-b688-f301c3a398e7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.771718 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx" podUID="c99ec3c0-d415-4322-95cd-d57411a1db7b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.103:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.771769 4898 patch_prober.go:28] interesting pod/console-operator-58897d9998-rkglj container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.771842 4898 patch_prober.go:28] interesting 
pod/console-operator-58897d9998-rkglj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.771843 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-rkglj" podUID="cf77b13b-75a2-4fc1-a068-8fd33773f827" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.771898 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-rkglj" podUID="cf77b13b-75a2-4fc1-a068-8fd33773f827" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.782610 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="5ae31191-f9f6-452a-8f45-a48b4736012e" containerName="galera" probeResult="failure" output="command timed out" Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.782647 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="5ae31191-f9f6-452a-8f45-a48b4736012e" containerName="galera" probeResult="failure" output="command timed out" Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.854703 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-4bmq4 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": context deadline exceeded (Client.Timeout exceeded 
while awaiting headers)" start-of-body= Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.854712 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hmbfx" podUID="c99ec3c0-d415-4322-95cd-d57411a1db7b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.854766 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4bmq4" podUID="5442029d-e1aa-4496-b3c6-18f11b034179" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.906064 4898 trace.go:236] Trace[943270162]: "Calculate volume metrics of config-volume for pod openshift-dns/dns-default-cc9qv" (11-Dec-2025 14:27:51.442) (total time: 1463ms): Dec 11 14:27:52 crc kubenswrapper[4898]: Trace[943270162]: [1.463983595s] [1.463983595s] END Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.952578 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.953062 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 11 14:27:52 crc kubenswrapper[4898]: I1211 14:27:52.954176 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"182cb6777d8dce969ca4cceb8ce2ae318316c8e9eefc71b7a475503bb04dab70"} pod="openstack/openstack-cell1-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Dec 11 14:27:53 crc kubenswrapper[4898]: I1211 14:27:53.031633 4898 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-rfxp7" podUID="c154e39f-1760-4071-b688-f301c3a398e7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:53 crc kubenswrapper[4898]: I1211 14:27:53.031680 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw" podUID="5c391a19-7c2d-4838-9269-2c5cd8eea1ad" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:53 crc kubenswrapper[4898]: I1211 14:27:53.031734 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-4bmq4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:27:53 crc kubenswrapper[4898]: I1211 14:27:53.031752 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4bmq4" podUID="5442029d-e1aa-4496-b3c6-18f11b034179" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:53 crc kubenswrapper[4898]: I1211 14:27:53.031778 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc" podUID="d7d8e047-7525-4d88-b802-550590e7f743" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:53 crc kubenswrapper[4898]: I1211 14:27:53.036097 4898 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-pdvzc" podUID="d7d8e047-7525-4d88-b802-550590e7f743" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:53 crc kubenswrapper[4898]: I1211 14:27:53.036768 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7kffw" podUID="5c391a19-7c2d-4838-9269-2c5cd8eea1ad" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:53 crc kubenswrapper[4898]: I1211 14:27:53.783828 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="69fcbdad-bb34-4a36-9100-352ddce7c906" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 11 14:27:53 crc kubenswrapper[4898]: I1211 14:27:53.783872 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="69fcbdad-bb34-4a36-9100-352ddce7c906" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 11 14:27:53 crc kubenswrapper[4898]: I1211 14:27:53.801933 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="5ae31191-f9f6-452a-8f45-a48b4736012e" containerName="galera" probeResult="failure" output="command timed out" Dec 11 14:27:53 crc kubenswrapper[4898]: I1211 14:27:53.909706 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pjgq4 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:27:53 crc 
kubenswrapper[4898]: I1211 14:27:53.909753 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" podUID="14360874-1ac7-4262-b7ed-3ccc4d909191" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:53 crc kubenswrapper[4898]: I1211 14:27:53.910649 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pjgq4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:27:53 crc kubenswrapper[4898]: I1211 14:27:53.910674 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pjgq4" podUID="14360874-1ac7-4262-b7ed-3ccc4d909191" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:54 crc kubenswrapper[4898]: I1211 14:27:54.262450 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-6gtqk" Dec 11 14:27:54 crc kubenswrapper[4898]: I1211 14:27:54.499667 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" podUID="351e2ec9-301e-4fd9-b8ef-45494b9a1291" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:54 crc kubenswrapper[4898]: I1211 14:27:54.500012 4898 prober.go:107] "Probe failed" probeType="Liveness" 
pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" podUID="351e2ec9-301e-4fd9-b8ef-45494b9a1291" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:27:55 crc kubenswrapper[4898]: I1211 14:27:55.193797 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-6gtqk" Dec 11 14:27:55 crc kubenswrapper[4898]: I1211 14:27:55.660291 4898 generic.go:334] "Generic (PLEG): container finished" podID="d71ee22f-68e7-43d7-8a6a-012ff8b8104e" containerID="d9a08fbb5b93e0892e7764b24571d0aa7e4a85ae6a652d601607d681dfc4b9ff" exitCode=0 Dec 11 14:27:55 crc kubenswrapper[4898]: I1211 14:27:55.660545 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d71ee22f-68e7-43d7-8a6a-012ff8b8104e","Type":"ContainerDied","Data":"d9a08fbb5b93e0892e7764b24571d0aa7e4a85ae6a652d601607d681dfc4b9ff"} Dec 11 14:27:55 crc kubenswrapper[4898]: I1211 14:27:55.919637 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="5ae31191-f9f6-452a-8f45-a48b4736012e" containerName="galera" containerID="cri-o://182cb6777d8dce969ca4cceb8ce2ae318316c8e9eefc71b7a475503bb04dab70" gracePeriod=28 Dec 11 14:27:56 crc kubenswrapper[4898]: I1211 14:27:56.032756 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="2e7cffb6-80f8-45e8-a4ab-219dc834a613" containerName="galera" containerID="cri-o://7ab646376dff1e213ff8bbf79f0d74175b5dbe8db34e0f510cbcf47d38898a79" gracePeriod=25 Dec 11 14:27:56 crc kubenswrapper[4898]: I1211 14:27:56.675743 4898 generic.go:334] "Generic (PLEG): container finished" podID="512ddc04-04b7-409c-a856-4f9bf3b22c50" containerID="56cf577ae4d3d1c32613200fed078bc4a9ef27f100edd8ae075c7e367b21e77e" exitCode=0 Dec 11 14:27:56 crc 
kubenswrapper[4898]: I1211 14:27:56.675838 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"512ddc04-04b7-409c-a856-4f9bf3b22c50","Type":"ContainerDied","Data":"56cf577ae4d3d1c32613200fed078bc4a9ef27f100edd8ae075c7e367b21e77e"} Dec 11 14:27:56 crc kubenswrapper[4898]: I1211 14:27:56.741415 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d71ee22f-68e7-43d7-8a6a-012ff8b8104e","Type":"ContainerStarted","Data":"046710d2aeb7addc891251003be9c15062827a2f022553ef2047d67532d18058"} Dec 11 14:27:57 crc kubenswrapper[4898]: I1211 14:27:57.756255 4898 generic.go:334] "Generic (PLEG): container finished" podID="5ae31191-f9f6-452a-8f45-a48b4736012e" containerID="182cb6777d8dce969ca4cceb8ce2ae318316c8e9eefc71b7a475503bb04dab70" exitCode=0 Dec 11 14:27:57 crc kubenswrapper[4898]: I1211 14:27:57.756646 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5ae31191-f9f6-452a-8f45-a48b4736012e","Type":"ContainerDied","Data":"182cb6777d8dce969ca4cceb8ce2ae318316c8e9eefc71b7a475503bb04dab70"} Dec 11 14:27:57 crc kubenswrapper[4898]: I1211 14:27:57.756908 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5ae31191-f9f6-452a-8f45-a48b4736012e","Type":"ContainerStarted","Data":"061517a62d6930cea52a75534317d893b3aa078bd831c17a7ee785e6e6f7bb29"} Dec 11 14:27:57 crc kubenswrapper[4898]: I1211 14:27:57.762668 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"512ddc04-04b7-409c-a856-4f9bf3b22c50","Type":"ContainerStarted","Data":"648ab5b3da3342d1b7474167033cf8cf1b80b0ff158fc375ca70e29afb425175"} Dec 11 14:27:58 crc kubenswrapper[4898]: I1211 14:27:58.773714 4898 generic.go:334] "Generic (PLEG): container finished" podID="2e7cffb6-80f8-45e8-a4ab-219dc834a613" 
containerID="7ab646376dff1e213ff8bbf79f0d74175b5dbe8db34e0f510cbcf47d38898a79" exitCode=0 Dec 11 14:27:58 crc kubenswrapper[4898]: I1211 14:27:58.773796 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2e7cffb6-80f8-45e8-a4ab-219dc834a613","Type":"ContainerDied","Data":"7ab646376dff1e213ff8bbf79f0d74175b5dbe8db34e0f510cbcf47d38898a79"} Dec 11 14:27:58 crc kubenswrapper[4898]: I1211 14:27:58.774172 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2e7cffb6-80f8-45e8-a4ab-219dc834a613","Type":"ContainerStarted","Data":"8389b798fc906b23dfdb25ae5f315096c2fe23d42f01c36b109f2a50b8906a6c"} Dec 11 14:27:59 crc kubenswrapper[4898]: I1211 14:27:59.695674 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 11 14:27:59 crc kubenswrapper[4898]: I1211 14:27:59.695963 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 11 14:28:00 crc kubenswrapper[4898]: I1211 14:28:00.840054 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9bw58" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" probeResult="failure" output=< Dec 11 14:28:00 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:28:00 crc kubenswrapper[4898]: > Dec 11 14:28:00 crc kubenswrapper[4898]: I1211 14:28:00.840426 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:28:00 crc kubenswrapper[4898]: I1211 14:28:00.841397 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"17cae00930c797a1735d0d7a2c1dce3066f4554f07ef4ccc04fa9f4ec1578dfc"} pod="openshift-marketplace/redhat-operators-9bw58" containerMessage="Container 
registry-server failed startup probe, will be restarted" Dec 11 14:28:00 crc kubenswrapper[4898]: I1211 14:28:00.841427 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9bw58" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" containerID="cri-o://17cae00930c797a1735d0d7a2c1dce3066f4554f07ef4ccc04fa9f4ec1578dfc" gracePeriod=30 Dec 11 14:28:01 crc kubenswrapper[4898]: I1211 14:28:01.036680 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 11 14:28:01 crc kubenswrapper[4898]: I1211 14:28:01.037988 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 11 14:28:01 crc kubenswrapper[4898]: I1211 14:28:01.151569 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 11 14:28:01 crc kubenswrapper[4898]: I1211 14:28:01.802485 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 11 14:28:01 crc kubenswrapper[4898]: I1211 14:28:01.814886 4898 generic.go:334] "Generic (PLEG): container finished" podID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerID="17cae00930c797a1735d0d7a2c1dce3066f4554f07ef4ccc04fa9f4ec1578dfc" exitCode=0 Dec 11 14:28:01 crc kubenswrapper[4898]: I1211 14:28:01.816488 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bw58" event={"ID":"2b267048-1a44-4a75-bd64-24641a6b5a63","Type":"ContainerDied","Data":"17cae00930c797a1735d0d7a2c1dce3066f4554f07ef4ccc04fa9f4ec1578dfc"} Dec 11 14:28:01 crc kubenswrapper[4898]: I1211 14:28:01.924898 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 11 14:28:01 crc kubenswrapper[4898]: I1211 14:28:01.949327 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 11 14:28:02 crc kubenswrapper[4898]: I1211 14:28:02.754034 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 11 14:28:02 crc kubenswrapper[4898]: I1211 14:28:02.808804 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 11 14:28:02 crc kubenswrapper[4898]: I1211 14:28:02.837202 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bw58" event={"ID":"2b267048-1a44-4a75-bd64-24641a6b5a63","Type":"ContainerStarted","Data":"c24f15e0520bfcaefc8494835b0d93cac5c03ccfbed67b04f6bca6f536eedf89"} Dec 11 14:28:04 crc kubenswrapper[4898]: I1211 14:28:04.995754 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:28:04 crc kubenswrapper[4898]: I1211 14:28:04.996165 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:28:09 crc kubenswrapper[4898]: I1211 14:28:09.784405 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:28:09 crc kubenswrapper[4898]: I1211 14:28:09.785021 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:28:10 crc kubenswrapper[4898]: I1211 14:28:10.845092 4898 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-9bw58" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" probeResult="failure" output=< Dec 11 14:28:10 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:28:10 crc kubenswrapper[4898]: > Dec 11 14:28:10 crc kubenswrapper[4898]: I1211 14:28:10.878832 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wb9g7"] Dec 11 14:28:10 crc kubenswrapper[4898]: I1211 14:28:10.884443 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:10 crc kubenswrapper[4898]: I1211 14:28:10.918566 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wb9g7"] Dec 11 14:28:11 crc kubenswrapper[4898]: I1211 14:28:11.014488 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-catalog-content\") pod \"community-operators-wb9g7\" (UID: \"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535\") " pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:11 crc kubenswrapper[4898]: I1211 14:28:11.014667 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc2n4\" (UniqueName: \"kubernetes.io/projected/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-kube-api-access-gc2n4\") pod \"community-operators-wb9g7\" (UID: \"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535\") " pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:11 crc kubenswrapper[4898]: I1211 14:28:11.014740 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-utilities\") pod \"community-operators-wb9g7\" (UID: 
\"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535\") " pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:11 crc kubenswrapper[4898]: I1211 14:28:11.117116 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc2n4\" (UniqueName: \"kubernetes.io/projected/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-kube-api-access-gc2n4\") pod \"community-operators-wb9g7\" (UID: \"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535\") " pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:11 crc kubenswrapper[4898]: I1211 14:28:11.117210 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-utilities\") pod \"community-operators-wb9g7\" (UID: \"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535\") " pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:11 crc kubenswrapper[4898]: I1211 14:28:11.117320 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-catalog-content\") pod \"community-operators-wb9g7\" (UID: \"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535\") " pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:11 crc kubenswrapper[4898]: I1211 14:28:11.118019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-catalog-content\") pod \"community-operators-wb9g7\" (UID: \"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535\") " pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:11 crc kubenswrapper[4898]: I1211 14:28:11.118599 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-utilities\") pod \"community-operators-wb9g7\" (UID: \"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535\") 
" pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:11 crc kubenswrapper[4898]: I1211 14:28:11.139276 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc2n4\" (UniqueName: \"kubernetes.io/projected/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-kube-api-access-gc2n4\") pod \"community-operators-wb9g7\" (UID: \"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535\") " pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:11 crc kubenswrapper[4898]: I1211 14:28:11.208427 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:12 crc kubenswrapper[4898]: I1211 14:28:12.021260 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wb9g7"] Dec 11 14:28:12 crc kubenswrapper[4898]: W1211 14:28:12.028927 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc951f4e9_2da4_4b2d_b134_7dc1b8b3b535.slice/crio-b192179d50d53d7721505acdeab4e7b8a05ad71546a0f1cbeb974f2488b9262f WatchSource:0}: Error finding container b192179d50d53d7721505acdeab4e7b8a05ad71546a0f1cbeb974f2488b9262f: Status 404 returned error can't find the container with id b192179d50d53d7721505acdeab4e7b8a05ad71546a0f1cbeb974f2488b9262f Dec 11 14:28:12 crc kubenswrapper[4898]: I1211 14:28:12.952514 4898 generic.go:334] "Generic (PLEG): container finished" podID="c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" containerID="d0e45a571262831a2a365339f316115789d34555f1aca70e920e8beec631b616" exitCode=0 Dec 11 14:28:12 crc kubenswrapper[4898]: I1211 14:28:12.952604 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9g7" event={"ID":"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535","Type":"ContainerDied","Data":"d0e45a571262831a2a365339f316115789d34555f1aca70e920e8beec631b616"} Dec 11 14:28:12 crc kubenswrapper[4898]: I1211 
14:28:12.952803 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9g7" event={"ID":"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535","Type":"ContainerStarted","Data":"b192179d50d53d7721505acdeab4e7b8a05ad71546a0f1cbeb974f2488b9262f"} Dec 11 14:28:13 crc kubenswrapper[4898]: I1211 14:28:13.989252 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9g7" event={"ID":"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535","Type":"ContainerStarted","Data":"6db4fa2fc38242060f091da5105b0065df08369cbd0e5ebc83e3e85bf71fd234"} Dec 11 14:28:16 crc kubenswrapper[4898]: I1211 14:28:16.014111 4898 generic.go:334] "Generic (PLEG): container finished" podID="c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" containerID="6db4fa2fc38242060f091da5105b0065df08369cbd0e5ebc83e3e85bf71fd234" exitCode=0 Dec 11 14:28:16 crc kubenswrapper[4898]: I1211 14:28:16.014536 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9g7" event={"ID":"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535","Type":"ContainerDied","Data":"6db4fa2fc38242060f091da5105b0065df08369cbd0e5ebc83e3e85bf71fd234"} Dec 11 14:28:17 crc kubenswrapper[4898]: I1211 14:28:17.029052 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9g7" event={"ID":"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535","Type":"ContainerStarted","Data":"ad5d6ca7137d9eb59308a8f900877d4145b1bc4ee6217b5f6631d10214c4c6d3"} Dec 11 14:28:17 crc kubenswrapper[4898]: I1211 14:28:17.064840 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wb9g7" podStartSLOduration=3.542231266 podStartE2EDuration="7.064306247s" podCreationTimestamp="2025-12-11 14:28:10 +0000 UTC" firstStartedPulling="2025-12-11 14:28:12.955043838 +0000 UTC m=+5050.527370275" lastFinishedPulling="2025-12-11 14:28:16.477118819 +0000 UTC m=+5054.049445256" 
observedRunningTime="2025-12-11 14:28:17.047313533 +0000 UTC m=+5054.619639980" watchObservedRunningTime="2025-12-11 14:28:17.064306247 +0000 UTC m=+5054.636632684" Dec 11 14:28:20 crc kubenswrapper[4898]: I1211 14:28:20.838836 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9bw58" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" probeResult="failure" output=< Dec 11 14:28:20 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:28:20 crc kubenswrapper[4898]: > Dec 11 14:28:21 crc kubenswrapper[4898]: I1211 14:28:21.209540 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:21 crc kubenswrapper[4898]: I1211 14:28:21.211015 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:22 crc kubenswrapper[4898]: I1211 14:28:22.276680 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-wb9g7" podUID="c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" containerName="registry-server" probeResult="failure" output=< Dec 11 14:28:22 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:28:22 crc kubenswrapper[4898]: > Dec 11 14:28:29 crc kubenswrapper[4898]: I1211 14:28:29.865468 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:28:29 crc kubenswrapper[4898]: I1211 14:28:29.930766 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:28:30 crc kubenswrapper[4898]: I1211 14:28:30.109254 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9bw58"] Dec 11 14:28:31 crc kubenswrapper[4898]: I1211 14:28:31.182313 4898 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9bw58" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" containerID="cri-o://c24f15e0520bfcaefc8494835b0d93cac5c03ccfbed67b04f6bca6f536eedf89" gracePeriod=2 Dec 11 14:28:31 crc kubenswrapper[4898]: I1211 14:28:31.272040 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:31 crc kubenswrapper[4898]: I1211 14:28:31.338145 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:31 crc kubenswrapper[4898]: I1211 14:28:31.903233 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:28:31 crc kubenswrapper[4898]: I1211 14:28:31.936784 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b267048-1a44-4a75-bd64-24641a6b5a63-catalog-content\") pod \"2b267048-1a44-4a75-bd64-24641a6b5a63\" (UID: \"2b267048-1a44-4a75-bd64-24641a6b5a63\") " Dec 11 14:28:31 crc kubenswrapper[4898]: I1211 14:28:31.937026 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b267048-1a44-4a75-bd64-24641a6b5a63-utilities\") pod \"2b267048-1a44-4a75-bd64-24641a6b5a63\" (UID: \"2b267048-1a44-4a75-bd64-24641a6b5a63\") " Dec 11 14:28:31 crc kubenswrapper[4898]: I1211 14:28:31.937097 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs6lt\" (UniqueName: \"kubernetes.io/projected/2b267048-1a44-4a75-bd64-24641a6b5a63-kube-api-access-gs6lt\") pod \"2b267048-1a44-4a75-bd64-24641a6b5a63\" (UID: \"2b267048-1a44-4a75-bd64-24641a6b5a63\") " Dec 11 14:28:31 crc kubenswrapper[4898]: I1211 
14:28:31.938385 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b267048-1a44-4a75-bd64-24641a6b5a63-utilities" (OuterVolumeSpecName: "utilities") pod "2b267048-1a44-4a75-bd64-24641a6b5a63" (UID: "2b267048-1a44-4a75-bd64-24641a6b5a63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:28:31 crc kubenswrapper[4898]: I1211 14:28:31.947014 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b267048-1a44-4a75-bd64-24641a6b5a63-kube-api-access-gs6lt" (OuterVolumeSpecName: "kube-api-access-gs6lt") pod "2b267048-1a44-4a75-bd64-24641a6b5a63" (UID: "2b267048-1a44-4a75-bd64-24641a6b5a63"). InnerVolumeSpecName "kube-api-access-gs6lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.039659 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs6lt\" (UniqueName: \"kubernetes.io/projected/2b267048-1a44-4a75-bd64-24641a6b5a63-kube-api-access-gs6lt\") on node \"crc\" DevicePath \"\"" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.039688 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b267048-1a44-4a75-bd64-24641a6b5a63-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.070925 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b267048-1a44-4a75-bd64-24641a6b5a63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b267048-1a44-4a75-bd64-24641a6b5a63" (UID: "2b267048-1a44-4a75-bd64-24641a6b5a63"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.141543 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b267048-1a44-4a75-bd64-24641a6b5a63-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.200904 4898 generic.go:334] "Generic (PLEG): container finished" podID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerID="c24f15e0520bfcaefc8494835b0d93cac5c03ccfbed67b04f6bca6f536eedf89" exitCode=0 Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.202140 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9bw58" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.204854 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bw58" event={"ID":"2b267048-1a44-4a75-bd64-24641a6b5a63","Type":"ContainerDied","Data":"c24f15e0520bfcaefc8494835b0d93cac5c03ccfbed67b04f6bca6f536eedf89"} Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.204924 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9bw58" event={"ID":"2b267048-1a44-4a75-bd64-24641a6b5a63","Type":"ContainerDied","Data":"03b65a6d965c0c58c5fb3e39eca279c8d8bb61774498fb612bac05726e7a6102"} Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.204948 4898 scope.go:117] "RemoveContainer" containerID="c24f15e0520bfcaefc8494835b0d93cac5c03ccfbed67b04f6bca6f536eedf89" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.254938 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9bw58"] Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.256618 4898 scope.go:117] "RemoveContainer" containerID="17cae00930c797a1735d0d7a2c1dce3066f4554f07ef4ccc04fa9f4ec1578dfc" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 
14:28:32.264353 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9bw58"] Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.281395 4898 scope.go:117] "RemoveContainer" containerID="9f49690e934231b33c37d5ca5f51e06294074d8a4fe199e0da14708bca1565b2" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.311929 4898 scope.go:117] "RemoveContainer" containerID="4a553195cc0dc577d426468f06a9cc33b11263e816533e5169604bf3a360ed8e" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.365694 4898 scope.go:117] "RemoveContainer" containerID="c24f15e0520bfcaefc8494835b0d93cac5c03ccfbed67b04f6bca6f536eedf89" Dec 11 14:28:32 crc kubenswrapper[4898]: E1211 14:28:32.367173 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c24f15e0520bfcaefc8494835b0d93cac5c03ccfbed67b04f6bca6f536eedf89\": container with ID starting with c24f15e0520bfcaefc8494835b0d93cac5c03ccfbed67b04f6bca6f536eedf89 not found: ID does not exist" containerID="c24f15e0520bfcaefc8494835b0d93cac5c03ccfbed67b04f6bca6f536eedf89" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.367392 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24f15e0520bfcaefc8494835b0d93cac5c03ccfbed67b04f6bca6f536eedf89"} err="failed to get container status \"c24f15e0520bfcaefc8494835b0d93cac5c03ccfbed67b04f6bca6f536eedf89\": rpc error: code = NotFound desc = could not find container \"c24f15e0520bfcaefc8494835b0d93cac5c03ccfbed67b04f6bca6f536eedf89\": container with ID starting with c24f15e0520bfcaefc8494835b0d93cac5c03ccfbed67b04f6bca6f536eedf89 not found: ID does not exist" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.367430 4898 scope.go:117] "RemoveContainer" containerID="17cae00930c797a1735d0d7a2c1dce3066f4554f07ef4ccc04fa9f4ec1578dfc" Dec 11 14:28:32 crc kubenswrapper[4898]: E1211 14:28:32.367813 4898 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17cae00930c797a1735d0d7a2c1dce3066f4554f07ef4ccc04fa9f4ec1578dfc\": container with ID starting with 17cae00930c797a1735d0d7a2c1dce3066f4554f07ef4ccc04fa9f4ec1578dfc not found: ID does not exist" containerID="17cae00930c797a1735d0d7a2c1dce3066f4554f07ef4ccc04fa9f4ec1578dfc" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.367863 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17cae00930c797a1735d0d7a2c1dce3066f4554f07ef4ccc04fa9f4ec1578dfc"} err="failed to get container status \"17cae00930c797a1735d0d7a2c1dce3066f4554f07ef4ccc04fa9f4ec1578dfc\": rpc error: code = NotFound desc = could not find container \"17cae00930c797a1735d0d7a2c1dce3066f4554f07ef4ccc04fa9f4ec1578dfc\": container with ID starting with 17cae00930c797a1735d0d7a2c1dce3066f4554f07ef4ccc04fa9f4ec1578dfc not found: ID does not exist" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.367896 4898 scope.go:117] "RemoveContainer" containerID="9f49690e934231b33c37d5ca5f51e06294074d8a4fe199e0da14708bca1565b2" Dec 11 14:28:32 crc kubenswrapper[4898]: E1211 14:28:32.368207 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f49690e934231b33c37d5ca5f51e06294074d8a4fe199e0da14708bca1565b2\": container with ID starting with 9f49690e934231b33c37d5ca5f51e06294074d8a4fe199e0da14708bca1565b2 not found: ID does not exist" containerID="9f49690e934231b33c37d5ca5f51e06294074d8a4fe199e0da14708bca1565b2" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.368247 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f49690e934231b33c37d5ca5f51e06294074d8a4fe199e0da14708bca1565b2"} err="failed to get container status \"9f49690e934231b33c37d5ca5f51e06294074d8a4fe199e0da14708bca1565b2\": rpc error: code = NotFound desc = could not find container 
\"9f49690e934231b33c37d5ca5f51e06294074d8a4fe199e0da14708bca1565b2\": container with ID starting with 9f49690e934231b33c37d5ca5f51e06294074d8a4fe199e0da14708bca1565b2 not found: ID does not exist" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.368273 4898 scope.go:117] "RemoveContainer" containerID="4a553195cc0dc577d426468f06a9cc33b11263e816533e5169604bf3a360ed8e" Dec 11 14:28:32 crc kubenswrapper[4898]: E1211 14:28:32.368556 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a553195cc0dc577d426468f06a9cc33b11263e816533e5169604bf3a360ed8e\": container with ID starting with 4a553195cc0dc577d426468f06a9cc33b11263e816533e5169604bf3a360ed8e not found: ID does not exist" containerID="4a553195cc0dc577d426468f06a9cc33b11263e816533e5169604bf3a360ed8e" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.368587 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a553195cc0dc577d426468f06a9cc33b11263e816533e5169604bf3a360ed8e"} err="failed to get container status \"4a553195cc0dc577d426468f06a9cc33b11263e816533e5169604bf3a360ed8e\": rpc error: code = NotFound desc = could not find container \"4a553195cc0dc577d426468f06a9cc33b11263e816533e5169604bf3a360ed8e\": container with ID starting with 4a553195cc0dc577d426468f06a9cc33b11263e816533e5169604bf3a360ed8e not found: ID does not exist" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.788934 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" path="/var/lib/kubelet/pods/2b267048-1a44-4a75-bd64-24641a6b5a63/volumes" Dec 11 14:28:32 crc kubenswrapper[4898]: I1211 14:28:32.909533 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wb9g7"] Dec 11 14:28:33 crc kubenswrapper[4898]: I1211 14:28:33.212825 4898 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-wb9g7" podUID="c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" containerName="registry-server" containerID="cri-o://ad5d6ca7137d9eb59308a8f900877d4145b1bc4ee6217b5f6631d10214c4c6d3" gracePeriod=2 Dec 11 14:28:33 crc kubenswrapper[4898]: I1211 14:28:33.854759 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:33 crc kubenswrapper[4898]: I1211 14:28:33.892913 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-utilities\") pod \"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535\" (UID: \"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535\") " Dec 11 14:28:33 crc kubenswrapper[4898]: I1211 14:28:33.893343 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-catalog-content\") pod \"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535\" (UID: \"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535\") " Dec 11 14:28:33 crc kubenswrapper[4898]: I1211 14:28:33.893481 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc2n4\" (UniqueName: \"kubernetes.io/projected/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-kube-api-access-gc2n4\") pod \"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535\" (UID: \"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535\") " Dec 11 14:28:33 crc kubenswrapper[4898]: I1211 14:28:33.893798 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-utilities" (OuterVolumeSpecName: "utilities") pod "c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" (UID: "c951f4e9-2da4-4b2d-b134-7dc1b8b3b535"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:28:33 crc kubenswrapper[4898]: I1211 14:28:33.894082 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:28:33 crc kubenswrapper[4898]: I1211 14:28:33.937190 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" (UID: "c951f4e9-2da4-4b2d-b134-7dc1b8b3b535"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:28:33 crc kubenswrapper[4898]: I1211 14:28:33.942573 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-kube-api-access-gc2n4" (OuterVolumeSpecName: "kube-api-access-gc2n4") pod "c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" (UID: "c951f4e9-2da4-4b2d-b134-7dc1b8b3b535"). InnerVolumeSpecName "kube-api-access-gc2n4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:28:33 crc kubenswrapper[4898]: I1211 14:28:33.996258 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:28:33 crc kubenswrapper[4898]: I1211 14:28:33.996292 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc2n4\" (UniqueName: \"kubernetes.io/projected/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535-kube-api-access-gc2n4\") on node \"crc\" DevicePath \"\"" Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.225259 4898 generic.go:334] "Generic (PLEG): container finished" podID="c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" containerID="ad5d6ca7137d9eb59308a8f900877d4145b1bc4ee6217b5f6631d10214c4c6d3" exitCode=0 Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.225299 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9g7" event={"ID":"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535","Type":"ContainerDied","Data":"ad5d6ca7137d9eb59308a8f900877d4145b1bc4ee6217b5f6631d10214c4c6d3"} Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.225317 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wb9g7" Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.225333 4898 scope.go:117] "RemoveContainer" containerID="ad5d6ca7137d9eb59308a8f900877d4145b1bc4ee6217b5f6631d10214c4c6d3" Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.225323 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9g7" event={"ID":"c951f4e9-2da4-4b2d-b134-7dc1b8b3b535","Type":"ContainerDied","Data":"b192179d50d53d7721505acdeab4e7b8a05ad71546a0f1cbeb974f2488b9262f"} Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.262560 4898 scope.go:117] "RemoveContainer" containerID="6db4fa2fc38242060f091da5105b0065df08369cbd0e5ebc83e3e85bf71fd234" Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.266248 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wb9g7"] Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.284047 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wb9g7"] Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.292769 4898 scope.go:117] "RemoveContainer" containerID="d0e45a571262831a2a365339f316115789d34555f1aca70e920e8beec631b616" Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.341448 4898 scope.go:117] "RemoveContainer" containerID="ad5d6ca7137d9eb59308a8f900877d4145b1bc4ee6217b5f6631d10214c4c6d3" Dec 11 14:28:34 crc kubenswrapper[4898]: E1211 14:28:34.341844 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad5d6ca7137d9eb59308a8f900877d4145b1bc4ee6217b5f6631d10214c4c6d3\": container with ID starting with ad5d6ca7137d9eb59308a8f900877d4145b1bc4ee6217b5f6631d10214c4c6d3 not found: ID does not exist" containerID="ad5d6ca7137d9eb59308a8f900877d4145b1bc4ee6217b5f6631d10214c4c6d3" Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.341876 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad5d6ca7137d9eb59308a8f900877d4145b1bc4ee6217b5f6631d10214c4c6d3"} err="failed to get container status \"ad5d6ca7137d9eb59308a8f900877d4145b1bc4ee6217b5f6631d10214c4c6d3\": rpc error: code = NotFound desc = could not find container \"ad5d6ca7137d9eb59308a8f900877d4145b1bc4ee6217b5f6631d10214c4c6d3\": container with ID starting with ad5d6ca7137d9eb59308a8f900877d4145b1bc4ee6217b5f6631d10214c4c6d3 not found: ID does not exist" Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.341921 4898 scope.go:117] "RemoveContainer" containerID="6db4fa2fc38242060f091da5105b0065df08369cbd0e5ebc83e3e85bf71fd234" Dec 11 14:28:34 crc kubenswrapper[4898]: E1211 14:28:34.342312 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6db4fa2fc38242060f091da5105b0065df08369cbd0e5ebc83e3e85bf71fd234\": container with ID starting with 6db4fa2fc38242060f091da5105b0065df08369cbd0e5ebc83e3e85bf71fd234 not found: ID does not exist" containerID="6db4fa2fc38242060f091da5105b0065df08369cbd0e5ebc83e3e85bf71fd234" Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.342376 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6db4fa2fc38242060f091da5105b0065df08369cbd0e5ebc83e3e85bf71fd234"} err="failed to get container status \"6db4fa2fc38242060f091da5105b0065df08369cbd0e5ebc83e3e85bf71fd234\": rpc error: code = NotFound desc = could not find container \"6db4fa2fc38242060f091da5105b0065df08369cbd0e5ebc83e3e85bf71fd234\": container with ID starting with 6db4fa2fc38242060f091da5105b0065df08369cbd0e5ebc83e3e85bf71fd234 not found: ID does not exist" Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.342415 4898 scope.go:117] "RemoveContainer" containerID="d0e45a571262831a2a365339f316115789d34555f1aca70e920e8beec631b616" Dec 11 14:28:34 crc kubenswrapper[4898]: E1211 
14:28:34.343169 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0e45a571262831a2a365339f316115789d34555f1aca70e920e8beec631b616\": container with ID starting with d0e45a571262831a2a365339f316115789d34555f1aca70e920e8beec631b616 not found: ID does not exist" containerID="d0e45a571262831a2a365339f316115789d34555f1aca70e920e8beec631b616" Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.343197 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e45a571262831a2a365339f316115789d34555f1aca70e920e8beec631b616"} err="failed to get container status \"d0e45a571262831a2a365339f316115789d34555f1aca70e920e8beec631b616\": rpc error: code = NotFound desc = could not find container \"d0e45a571262831a2a365339f316115789d34555f1aca70e920e8beec631b616\": container with ID starting with d0e45a571262831a2a365339f316115789d34555f1aca70e920e8beec631b616 not found: ID does not exist" Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.791626 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" path="/var/lib/kubelet/pods/c951f4e9-2da4-4b2d-b134-7dc1b8b3b535/volumes" Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.996312 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.996367 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.996410 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.997531 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da62fe95b38866dac78489cf9559a274d6c76f5f176513a8e912deec87459d47"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 14:28:34 crc kubenswrapper[4898]: I1211 14:28:34.997602 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://da62fe95b38866dac78489cf9559a274d6c76f5f176513a8e912deec87459d47" gracePeriod=600 Dec 11 14:28:35 crc kubenswrapper[4898]: I1211 14:28:35.238360 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="da62fe95b38866dac78489cf9559a274d6c76f5f176513a8e912deec87459d47" exitCode=0 Dec 11 14:28:35 crc kubenswrapper[4898]: I1211 14:28:35.238434 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"da62fe95b38866dac78489cf9559a274d6c76f5f176513a8e912deec87459d47"} Dec 11 14:28:35 crc kubenswrapper[4898]: I1211 14:28:35.238850 4898 scope.go:117] "RemoveContainer" containerID="b394cf34ba2fd657a20c472d818353aff5e4e06ca5e8c6123308991c69d40158" Dec 11 14:28:36 crc kubenswrapper[4898]: I1211 14:28:36.254005 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" 
event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c"} Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.226593 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5"] Dec 11 14:30:00 crc kubenswrapper[4898]: E1211 14:30:00.227604 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" containerName="registry-server" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.227621 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" containerName="registry-server" Dec 11 14:30:00 crc kubenswrapper[4898]: E1211 14:30:00.227641 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.227647 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" Dec 11 14:30:00 crc kubenswrapper[4898]: E1211 14:30:00.227663 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" containerName="extract-utilities" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.227669 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" containerName="extract-utilities" Dec 11 14:30:00 crc kubenswrapper[4898]: E1211 14:30:00.227678 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" containerName="extract-content" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.227685 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" containerName="extract-content" Dec 11 14:30:00 crc kubenswrapper[4898]: E1211 14:30:00.227698 
4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="extract-content" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.227703 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="extract-content" Dec 11 14:30:00 crc kubenswrapper[4898]: E1211 14:30:00.227710 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.227716 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" Dec 11 14:30:00 crc kubenswrapper[4898]: E1211 14:30:00.227733 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="extract-utilities" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.227738 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="extract-utilities" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.227954 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c951f4e9-2da4-4b2d-b134-7dc1b8b3b535" containerName="registry-server" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.227981 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.227997 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b267048-1a44-4a75-bd64-24641a6b5a63" containerName="registry-server" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.230056 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.242087 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.242086 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.251485 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5"] Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.304880 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52debc45-051d-4fa4-9466-3a76e6ecb1cb-config-volume\") pod \"collect-profiles-29424390-6vnb5\" (UID: \"52debc45-051d-4fa4-9466-3a76e6ecb1cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.305010 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf2hq\" (UniqueName: \"kubernetes.io/projected/52debc45-051d-4fa4-9466-3a76e6ecb1cb-kube-api-access-zf2hq\") pod \"collect-profiles-29424390-6vnb5\" (UID: \"52debc45-051d-4fa4-9466-3a76e6ecb1cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.305083 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52debc45-051d-4fa4-9466-3a76e6ecb1cb-secret-volume\") pod \"collect-profiles-29424390-6vnb5\" (UID: \"52debc45-051d-4fa4-9466-3a76e6ecb1cb\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.406254 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52debc45-051d-4fa4-9466-3a76e6ecb1cb-config-volume\") pod \"collect-profiles-29424390-6vnb5\" (UID: \"52debc45-051d-4fa4-9466-3a76e6ecb1cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.406350 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf2hq\" (UniqueName: \"kubernetes.io/projected/52debc45-051d-4fa4-9466-3a76e6ecb1cb-kube-api-access-zf2hq\") pod \"collect-profiles-29424390-6vnb5\" (UID: \"52debc45-051d-4fa4-9466-3a76e6ecb1cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.406433 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52debc45-051d-4fa4-9466-3a76e6ecb1cb-secret-volume\") pod \"collect-profiles-29424390-6vnb5\" (UID: \"52debc45-051d-4fa4-9466-3a76e6ecb1cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.407106 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52debc45-051d-4fa4-9466-3a76e6ecb1cb-config-volume\") pod \"collect-profiles-29424390-6vnb5\" (UID: \"52debc45-051d-4fa4-9466-3a76e6ecb1cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.983854 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/52debc45-051d-4fa4-9466-3a76e6ecb1cb-secret-volume\") pod \"collect-profiles-29424390-6vnb5\" (UID: \"52debc45-051d-4fa4-9466-3a76e6ecb1cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" Dec 11 14:30:00 crc kubenswrapper[4898]: I1211 14:30:00.984272 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf2hq\" (UniqueName: \"kubernetes.io/projected/52debc45-051d-4fa4-9466-3a76e6ecb1cb-kube-api-access-zf2hq\") pod \"collect-profiles-29424390-6vnb5\" (UID: \"52debc45-051d-4fa4-9466-3a76e6ecb1cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" Dec 11 14:30:01 crc kubenswrapper[4898]: I1211 14:30:01.167406 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" Dec 11 14:30:01 crc kubenswrapper[4898]: I1211 14:30:01.781103 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5"] Dec 11 14:30:02 crc kubenswrapper[4898]: I1211 14:30:02.312817 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" event={"ID":"52debc45-051d-4fa4-9466-3a76e6ecb1cb","Type":"ContainerStarted","Data":"ed4a6926e3a4e68c387e227fb69aa489990e1f5c6679f6136c71d7c141613ff5"} Dec 11 14:30:02 crc kubenswrapper[4898]: I1211 14:30:02.313843 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" event={"ID":"52debc45-051d-4fa4-9466-3a76e6ecb1cb","Type":"ContainerStarted","Data":"1de1a7aacd840d57a743e888445b56325e4dddd1e429294e8e243e01bdf6a76a"} Dec 11 14:30:03 crc kubenswrapper[4898]: I1211 14:30:03.332416 4898 generic.go:334] "Generic (PLEG): container finished" podID="52debc45-051d-4fa4-9466-3a76e6ecb1cb" 
containerID="ed4a6926e3a4e68c387e227fb69aa489990e1f5c6679f6136c71d7c141613ff5" exitCode=0 Dec 11 14:30:03 crc kubenswrapper[4898]: I1211 14:30:03.332792 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" event={"ID":"52debc45-051d-4fa4-9466-3a76e6ecb1cb","Type":"ContainerDied","Data":"ed4a6926e3a4e68c387e227fb69aa489990e1f5c6679f6136c71d7c141613ff5"} Dec 11 14:30:04 crc kubenswrapper[4898]: I1211 14:30:04.840013 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" Dec 11 14:30:04 crc kubenswrapper[4898]: I1211 14:30:04.933127 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf2hq\" (UniqueName: \"kubernetes.io/projected/52debc45-051d-4fa4-9466-3a76e6ecb1cb-kube-api-access-zf2hq\") pod \"52debc45-051d-4fa4-9466-3a76e6ecb1cb\" (UID: \"52debc45-051d-4fa4-9466-3a76e6ecb1cb\") " Dec 11 14:30:04 crc kubenswrapper[4898]: I1211 14:30:04.933751 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52debc45-051d-4fa4-9466-3a76e6ecb1cb-secret-volume\") pod \"52debc45-051d-4fa4-9466-3a76e6ecb1cb\" (UID: \"52debc45-051d-4fa4-9466-3a76e6ecb1cb\") " Dec 11 14:30:04 crc kubenswrapper[4898]: I1211 14:30:04.933787 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52debc45-051d-4fa4-9466-3a76e6ecb1cb-config-volume\") pod \"52debc45-051d-4fa4-9466-3a76e6ecb1cb\" (UID: \"52debc45-051d-4fa4-9466-3a76e6ecb1cb\") " Dec 11 14:30:04 crc kubenswrapper[4898]: I1211 14:30:04.934538 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52debc45-051d-4fa4-9466-3a76e6ecb1cb-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"52debc45-051d-4fa4-9466-3a76e6ecb1cb" (UID: "52debc45-051d-4fa4-9466-3a76e6ecb1cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 14:30:04 crc kubenswrapper[4898]: I1211 14:30:04.940066 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52debc45-051d-4fa4-9466-3a76e6ecb1cb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "52debc45-051d-4fa4-9466-3a76e6ecb1cb" (UID: "52debc45-051d-4fa4-9466-3a76e6ecb1cb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 14:30:04 crc kubenswrapper[4898]: I1211 14:30:04.940435 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52debc45-051d-4fa4-9466-3a76e6ecb1cb-kube-api-access-zf2hq" (OuterVolumeSpecName: "kube-api-access-zf2hq") pod "52debc45-051d-4fa4-9466-3a76e6ecb1cb" (UID: "52debc45-051d-4fa4-9466-3a76e6ecb1cb"). InnerVolumeSpecName "kube-api-access-zf2hq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:30:05 crc kubenswrapper[4898]: I1211 14:30:05.036729 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf2hq\" (UniqueName: \"kubernetes.io/projected/52debc45-051d-4fa4-9466-3a76e6ecb1cb-kube-api-access-zf2hq\") on node \"crc\" DevicePath \"\"" Dec 11 14:30:05 crc kubenswrapper[4898]: I1211 14:30:05.036764 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52debc45-051d-4fa4-9466-3a76e6ecb1cb-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 11 14:30:05 crc kubenswrapper[4898]: I1211 14:30:05.036774 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52debc45-051d-4fa4-9466-3a76e6ecb1cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 14:30:05 crc kubenswrapper[4898]: I1211 14:30:05.356021 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" event={"ID":"52debc45-051d-4fa4-9466-3a76e6ecb1cb","Type":"ContainerDied","Data":"1de1a7aacd840d57a743e888445b56325e4dddd1e429294e8e243e01bdf6a76a"} Dec 11 14:30:05 crc kubenswrapper[4898]: I1211 14:30:05.356114 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1de1a7aacd840d57a743e888445b56325e4dddd1e429294e8e243e01bdf6a76a" Dec 11 14:30:05 crc kubenswrapper[4898]: I1211 14:30:05.356080 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424390-6vnb5" Dec 11 14:30:05 crc kubenswrapper[4898]: I1211 14:30:05.964328 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98"] Dec 11 14:30:05 crc kubenswrapper[4898]: I1211 14:30:05.989019 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424345-ksd98"] Dec 11 14:30:06 crc kubenswrapper[4898]: I1211 14:30:06.790838 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d34ea67-eff5-4813-b997-35b5656058dd" path="/var/lib/kubelet/pods/6d34ea67-eff5-4813-b997-35b5656058dd/volumes" Dec 11 14:30:42 crc kubenswrapper[4898]: I1211 14:30:42.685410 4898 scope.go:117] "RemoveContainer" containerID="fd0c91d3f24fedb4feab2794bb77331a3937188e1596bc5b4c6ebda68444b0cd" Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.033674 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bbth6"] Dec 11 14:30:54 crc kubenswrapper[4898]: E1211 14:30:54.034914 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52debc45-051d-4fa4-9466-3a76e6ecb1cb" containerName="collect-profiles" Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.034958 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="52debc45-051d-4fa4-9466-3a76e6ecb1cb" containerName="collect-profiles" Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.035328 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="52debc45-051d-4fa4-9466-3a76e6ecb1cb" containerName="collect-profiles" Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.039736 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.045718 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbth6"] Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.149214 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfmns\" (UniqueName: \"kubernetes.io/projected/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-kube-api-access-mfmns\") pod \"redhat-marketplace-bbth6\" (UID: \"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c\") " pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.149415 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-utilities\") pod \"redhat-marketplace-bbth6\" (UID: \"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c\") " pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.149454 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-catalog-content\") pod \"redhat-marketplace-bbth6\" (UID: \"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c\") " pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.251764 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-utilities\") pod \"redhat-marketplace-bbth6\" (UID: \"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c\") " pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.252075 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-catalog-content\") pod \"redhat-marketplace-bbth6\" (UID: \"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c\") " pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.252257 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfmns\" (UniqueName: \"kubernetes.io/projected/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-kube-api-access-mfmns\") pod \"redhat-marketplace-bbth6\" (UID: \"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c\") " pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.252393 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-utilities\") pod \"redhat-marketplace-bbth6\" (UID: \"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c\") " pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.252738 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-catalog-content\") pod \"redhat-marketplace-bbth6\" (UID: \"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c\") " pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.275269 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfmns\" (UniqueName: \"kubernetes.io/projected/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-kube-api-access-mfmns\") pod \"redhat-marketplace-bbth6\" (UID: \"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c\") " pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.373354 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:30:54 crc kubenswrapper[4898]: I1211 14:30:54.921490 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbth6"] Dec 11 14:30:56 crc kubenswrapper[4898]: I1211 14:30:56.018800 4898 generic.go:334] "Generic (PLEG): container finished" podID="a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c" containerID="7620f97c127d87db60fd103de8f2837fafef40dd48d2faec6abbba04426cf1ac" exitCode=0 Dec 11 14:30:56 crc kubenswrapper[4898]: I1211 14:30:56.018925 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbth6" event={"ID":"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c","Type":"ContainerDied","Data":"7620f97c127d87db60fd103de8f2837fafef40dd48d2faec6abbba04426cf1ac"} Dec 11 14:30:56 crc kubenswrapper[4898]: I1211 14:30:56.019300 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbth6" event={"ID":"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c","Type":"ContainerStarted","Data":"6eddce5379b37d2c98914e2d2f159738a4a64f51de95c7ce0867190bd780c565"} Dec 11 14:30:58 crc kubenswrapper[4898]: I1211 14:30:58.045753 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbth6" event={"ID":"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c","Type":"ContainerStarted","Data":"36d247cc4faa0a84216fa2049b291d71363f1dac5e575f5a9eb3909925add077"} Dec 11 14:31:00 crc kubenswrapper[4898]: I1211 14:31:00.070617 4898 generic.go:334] "Generic (PLEG): container finished" podID="a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c" containerID="36d247cc4faa0a84216fa2049b291d71363f1dac5e575f5a9eb3909925add077" exitCode=0 Dec 11 14:31:00 crc kubenswrapper[4898]: I1211 14:31:00.070705 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbth6" 
event={"ID":"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c","Type":"ContainerDied","Data":"36d247cc4faa0a84216fa2049b291d71363f1dac5e575f5a9eb3909925add077"} Dec 11 14:31:02 crc kubenswrapper[4898]: I1211 14:31:02.098268 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbth6" event={"ID":"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c","Type":"ContainerStarted","Data":"3a93c709439bf4f84ae9d0ed35c1caf2f97fd9fce679a2303cc2124e86f7cee6"} Dec 11 14:31:02 crc kubenswrapper[4898]: I1211 14:31:02.124547 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bbth6" podStartSLOduration=3.3147461209999998 podStartE2EDuration="8.12452347s" podCreationTimestamp="2025-12-11 14:30:54 +0000 UTC" firstStartedPulling="2025-12-11 14:30:56.021355189 +0000 UTC m=+5213.593681626" lastFinishedPulling="2025-12-11 14:31:00.831132528 +0000 UTC m=+5218.403458975" observedRunningTime="2025-12-11 14:31:02.123490853 +0000 UTC m=+5219.695817360" watchObservedRunningTime="2025-12-11 14:31:02.12452347 +0000 UTC m=+5219.696849937" Dec 11 14:31:04 crc kubenswrapper[4898]: I1211 14:31:04.373864 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:31:04 crc kubenswrapper[4898]: I1211 14:31:04.375190 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:31:04 crc kubenswrapper[4898]: I1211 14:31:04.433238 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:31:04 crc kubenswrapper[4898]: I1211 14:31:04.995594 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 11 14:31:04 crc kubenswrapper[4898]: I1211 14:31:04.996018 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:31:06 crc kubenswrapper[4898]: I1211 14:31:06.192986 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:31:06 crc kubenswrapper[4898]: I1211 14:31:06.270377 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbth6"] Dec 11 14:31:08 crc kubenswrapper[4898]: I1211 14:31:08.161597 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bbth6" podUID="a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c" containerName="registry-server" containerID="cri-o://3a93c709439bf4f84ae9d0ed35c1caf2f97fd9fce679a2303cc2124e86f7cee6" gracePeriod=2 Dec 11 14:31:09 crc kubenswrapper[4898]: I1211 14:31:09.180964 4898 generic.go:334] "Generic (PLEG): container finished" podID="a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c" containerID="3a93c709439bf4f84ae9d0ed35c1caf2f97fd9fce679a2303cc2124e86f7cee6" exitCode=0 Dec 11 14:31:09 crc kubenswrapper[4898]: I1211 14:31:09.181031 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbth6" event={"ID":"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c","Type":"ContainerDied","Data":"3a93c709439bf4f84ae9d0ed35c1caf2f97fd9fce679a2303cc2124e86f7cee6"} Dec 11 14:31:10 crc kubenswrapper[4898]: I1211 14:31:10.202214 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbth6" 
event={"ID":"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c","Type":"ContainerDied","Data":"6eddce5379b37d2c98914e2d2f159738a4a64f51de95c7ce0867190bd780c565"} Dec 11 14:31:10 crc kubenswrapper[4898]: I1211 14:31:10.203858 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eddce5379b37d2c98914e2d2f159738a4a64f51de95c7ce0867190bd780c565" Dec 11 14:31:10 crc kubenswrapper[4898]: I1211 14:31:10.536540 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:31:10 crc kubenswrapper[4898]: I1211 14:31:10.699942 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfmns\" (UniqueName: \"kubernetes.io/projected/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-kube-api-access-mfmns\") pod \"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c\" (UID: \"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c\") " Dec 11 14:31:10 crc kubenswrapper[4898]: I1211 14:31:10.700044 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-utilities\") pod \"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c\" (UID: \"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c\") " Dec 11 14:31:10 crc kubenswrapper[4898]: I1211 14:31:10.700157 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-catalog-content\") pod \"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c\" (UID: \"a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c\") " Dec 11 14:31:10 crc kubenswrapper[4898]: I1211 14:31:10.700744 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-utilities" (OuterVolumeSpecName: "utilities") pod "a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c" (UID: "a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:31:10 crc kubenswrapper[4898]: I1211 14:31:10.710028 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-kube-api-access-mfmns" (OuterVolumeSpecName: "kube-api-access-mfmns") pod "a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c" (UID: "a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c"). InnerVolumeSpecName "kube-api-access-mfmns". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:31:10 crc kubenswrapper[4898]: I1211 14:31:10.727086 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c" (UID: "a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:31:10 crc kubenswrapper[4898]: I1211 14:31:10.803023 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfmns\" (UniqueName: \"kubernetes.io/projected/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-kube-api-access-mfmns\") on node \"crc\" DevicePath \"\"" Dec 11 14:31:10 crc kubenswrapper[4898]: I1211 14:31:10.803052 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:31:10 crc kubenswrapper[4898]: I1211 14:31:10.803063 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:31:11 crc kubenswrapper[4898]: I1211 14:31:11.216662 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbth6" Dec 11 14:31:11 crc kubenswrapper[4898]: I1211 14:31:11.250397 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbth6"] Dec 11 14:31:11 crc kubenswrapper[4898]: I1211 14:31:11.264102 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbth6"] Dec 11 14:31:12 crc kubenswrapper[4898]: I1211 14:31:12.787965 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c" path="/var/lib/kubelet/pods/a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c/volumes" Dec 11 14:31:44 crc kubenswrapper[4898]: I1211 14:31:44.850019 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-69ffd5987-jj95c container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.61:8081/ready\": context deadline exceeded" start-of-body= Dec 11 14:31:44 crc kubenswrapper[4898]: I1211 14:31:44.850011 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-69ffd5987-jj95c container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.61:8083/ready\": context deadline exceeded" start-of-body= Dec 11 14:31:44 crc kubenswrapper[4898]: I1211 14:31:44.850650 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" podUID="c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.61:8081/ready\": context deadline exceeded" Dec 11 14:31:44 crc kubenswrapper[4898]: I1211 14:31:44.850690 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" podUID="c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.61:8083/ready\": context deadline exceeded" Dec 11 14:31:44 crc 
kubenswrapper[4898]: I1211 14:31:44.935237 4898 patch_prober.go:28] interesting pod/thanos-querier-b96d76946-svz9h container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.74:9091/-/ready\": context deadline exceeded" start-of-body= Dec 11 14:31:44 crc kubenswrapper[4898]: I1211 14:31:44.935364 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-b96d76946-svz9h" podUID="7cd8b643-f10c-480e-b851-1d13d35eba18" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.74:9091/-/ready\": context deadline exceeded" Dec 11 14:31:45 crc kubenswrapper[4898]: I1211 14:31:45.955611 4898 patch_prober.go:28] interesting pod/monitoring-plugin-5ffffb4f84-84dnp container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.77:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:45 crc kubenswrapper[4898]: I1211 14:31:45.955709 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp" podUID="1111559b-96c1-4918-b502-1b5045b8a9da" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.77:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:45 crc kubenswrapper[4898]: I1211 14:31:45.956742 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-585b5958fd-m8kkm" podUID="351e2ec9-301e-4fd9-b8ef-45494b9a1291" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:45 crc kubenswrapper[4898]: I1211 14:31:45.955696 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-69ffd5987-jj95c 
container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.61:8083/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:45 crc kubenswrapper[4898]: I1211 14:31:45.960686 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" podUID="c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.61:8083/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:45 crc kubenswrapper[4898]: I1211 14:31:45.964767 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" podUID="911f5e17-1a51-4bf3-8f1c-cdedc2f4404c" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.1.16:8000/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:45 crc kubenswrapper[4898]: I1211 14:31:45.985325 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-trbn2 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:45 crc kubenswrapper[4898]: I1211 14:31:45.985389 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" podUID="bb6a1bca-9d01-4e70-882f-47a6e90923df" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.57:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.016317 4898 fsHandler.go:133] fs: disk usage and inodes count on 
following dirs took 2.016907128s: [/var/lib/containers/storage/overlay/2b1bd9c069ea2ef14b3f1cc97941f0b7575f9a7df709538c722fafb666e5ad39/diff ]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.016668 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.017108293s: [/var/lib/containers/storage/overlay/3f4984effff4c73beeee533e135b1a239afecc1ed8de784be19c02ba82310e95/diff ]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.045577 4898 patch_prober.go:28] interesting pod/oauth-openshift-6cb668d466-7t64n container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.045632 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" podUID="5531fe73-f1a4-4a40-9458-536d6a8e1865" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.047015 4898 patch_prober.go:28] interesting pod/oauth-openshift-6cb668d466-7t64n container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.047122 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-6cb668d466-7t64n" 
podUID="5531fe73-f1a4-4a40-9458-536d6a8e1865" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.049863 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.052038835s: [/var/lib/containers/storage/overlay/511ece44d9132d69ddea9c71ac436c77986bf33aa4761ad3fd74bf34175db1e0/diff /var/log/pods/openshift-operator-lifecycle-manager_packageserver-d55dfcdfc-pjgq4_14360874-1ac7-4262-b7ed-3ccc4d909191/packageserver/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.049907 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.051433729s: [/var/lib/containers/storage/overlay/25a29c47dc16f4c521bcf029f9e6c36ae06766ae94e0e2fd9fae9eb0a050a3ec/diff /var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/account-server/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.049904 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.050986957s: [/var/lib/containers/storage/overlay/9ff339ebeceddaf598837dffce446490aca6b2499625ac922e8d5898ef2ae905/diff /var/log/pods/openshift-operators_observability-operator-d8bb48f5d-cnkcn_2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406/operator/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.049926 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.050807442s: [/var/lib/containers/storage/overlay/3edaee3eb7122c214602a93b78767e1e9500cf7138ec26fbdca5b03749ef50c8/diff /var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/account-auditor/0.log]; will not log again for this 
container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.049941 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.050558675s: [/var/lib/containers/storage/overlay/ab83abeaaa3c7cd38a22ee732b1e936c0451db68df92ef86021d1b37d601408e/diff /var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-tq8mw_0c6054e7-bb0a-4cbd-b459-d9d100182fa1/kube-rbac-proxy/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.049959 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.051036529s: [/var/lib/containers/storage/overlay/6218af2f17f9d6f869bad546ffaf66451890d7d6db9fa69957d987e3f475a3ad/diff /var/log/pods/openstack_cinder-api-0_3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2/cinder-api-log/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.049985 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.050599856s: [/var/lib/containers/storage/overlay/5da1d2bed6830251fdf1f07494a5dfd940c6073ceaf6464170a9ee5531ed32ff/diff /var/log/pods/hostpath-provisioner_csi-hostpathplugin-n8qh4_aa90eeb0-bd02-434e-a457-47336b084be7/hostpath-provisioner/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050009 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.050613017s: [/var/lib/containers/storage/overlay/62985ccb2feeef4483c206dbb1d6b2058878d2f1d98a6b8640694ff0b8ac3355/diff /var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-756b6f6bc6-p9xjl_065a3503-e7a8-4b7d-9cb7-366489ac0247/openshift-controller-manager-operator/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050033 4898 fsHandler.go:133] fs: 
disk usage and inodes count on following dirs took 2.050635598s: [/var/lib/containers/storage/overlay/48981d6338fdb053ec21d77a5023e2c1307ff8bfbd9d7f29320e1fdff5ceefe4/diff /var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-676rf_aeb38d85-05d0-4a84-b3a7-4a7a168ccd98/control-plane-machine-set-operator/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050056 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.050657068s: [/var/lib/containers/storage/overlay/2996a5921c33c355beaa0cb2bcfcdb5440302dd5c47663e9cdb8f35ba3b65700/diff /var/log/pods/openshift-logging_collector-ql6sd_c696c635-6d7a-40d8-aef4-ee5781067e7f/collector/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050078 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.050678269s: [/var/lib/containers/storage/overlay/6bef6eebb8be4216c1809e5ecb38b7892009499abe336c512d1b5f4d23a0a1b9/diff /var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-b67b599dd-lw2tt_7d9ad469-8c6d-45df-8a7d-84250d766f58/kube-storage-version-migrator-operator/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050097 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.050713059s: [/var/lib/containers/storage/overlay/497fc6ba0bc96d608066b9aacc8ec221c436e92af6e858a59ce8fa19bf0f65b1/diff /var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/container-server/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050101 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.050519555s: 
[/var/lib/containers/storage/overlay/d80d98c6b33385a33a525a88e2c17d4c715dd3ca89d0fff19e7e32ddbd5b39d0/diff /var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-2wt95_cb6adf46-208a-4945-97aa-2c457b9c2614/kube-rbac-proxy/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050121 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.050536345s: [/var/lib/containers/storage/overlay/103595f4600e18df8ee4739cd505f6867799c3d0969c4f666421481582d4be8a/diff /var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-vjbmx_39bfd95c-8066-475a-ae86-f34ce2cae1e7/multus-admission-controller/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050126 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.050543825s: [/var/lib/containers/storage/overlay/1ce4bc319a11f7936e056d6ef27337ebe2863e24a246f3e4523a1c055a6da9be/diff /var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-dvtzm_88e97f63-c1cb-4ef1-9d95-0c11dc52c94c/kube-rbac-proxy/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050145 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.050425092s: [/var/lib/containers/storage/overlay/a0ec839be660362c3a4ac4e75dfb86d8fc0f744f3494389935fc554c384b2793/diff /var/log/pods/openstack_rabbitmq-cell1-server-0_91c646bc-40ca-434e-8db2-df2eb46c4e5e/rabbitmq/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050144 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.050558916s: [/var/lib/containers/storage/overlay/0ba7fdff2365069b0c15390f7ab9f82f38e94e17004cf11d2d609ab02d96bbf4/diff 
/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-7kffw_5c391a19-7c2d-4838-9269-2c5cd8eea1ad/kube-rbac-proxy/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050164 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.050317909s: [/var/lib/containers/storage/overlay/776359622f9e70c582f9ced76f9f4e066d77a8e50601bf121aacc1ec72c85ab7/diff /var/log/pods/hostpath-provisioner_csi-hostpathplugin-n8qh4_aa90eeb0-bd02-434e-a457-47336b084be7/node-driver-registrar/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050168 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.041416191s: [/var/lib/containers/storage/overlay/d3748530a9b1b72909cc8d11e8335bdb8c1f1353c5f050ad8e79050d545e4757/diff /var/log/pods/openstack-operators_watcher-operator-controller-manager-75944c9b7-9kw6z_7d6dbccc-94de-44f9-b7d2-5bbcfee1d119/manager/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050184 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.041293859s: [/var/lib/containers/storage/overlay/038c2ccb1b7814145fbf65e418648d95cb8cac0ee7f33f80f6c18293e94d155a/diff /var/log/pods/openstack_neutron-84b4b98fdc-tbjdg_72ca8f16-912b-44f0-bc9d-868f381fb8fb/neutron-httpd/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050198 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.041044272s: [/var/lib/containers/storage/overlay/25e50b374b4be3f851d00b13a36158b080566184a19002e017a8fe4b58b60a7d/diff /var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-nnwb2_e65a531b-17ae-4b24-b9f6-71c758a757b0/cluster-samples-operator/1.log]; will not 
log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050212 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.040892148s: [/var/lib/containers/storage/overlay/03f96aea2e4762ed52e1f6e9ab23025dee8b721a73d1792eb60cfd27c855b890/diff /var/log/pods/openstack_glance-default-external-api-0_72c623d3-596d-4db8-8447-3bc93f7187e4/glance-log/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050224 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.040578829s: [/var/lib/containers/storage/overlay/7a3e7e2f5d385b0d495399c39582bb4ebc269c866de339cfad1a34e24b35bb4c/diff /var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/account-replicator/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050237 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.040551928s: [/var/lib/containers/storage/overlay/1eed9d8ba6db2497f06f395c0c48cdee89c7a09da712ba7cfdf14139b9d2d7bf/diff /var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/account-reaper/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050251 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.040564779s: [/var/lib/containers/storage/overlay/ef2ba5c413595ba8bce352375b1bb9114b09c6fefe3467d467102f3044ae85e6/diff /var/log/pods/openshift-monitoring_openshift-state-metrics-566fddb674-p7nkr_05eede04-2d90-4979-908e-29a4d88daf36/kube-rbac-proxy-self/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050267 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.040443976s: 
[/var/lib/containers/storage/overlay/1548b0ab3c18f387a0680f9037cd00fdb941d7362221dba40c2edea8587cf872/diff /var/log/pods/openstack_placement-6849f86cdd-lt69n_0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1/placement-log/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050279 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.040440705s: [/var/lib/containers/storage/overlay/7bc0772868136dcb554fd47f435e67cf93ed14674c22cd064408c46e571f10e7/diff /var/log/pods/openstack_placement-6849f86cdd-lt69n_0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1/placement-api/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050306 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.052328603s: [/var/lib/containers/storage/overlay/13caf76c892a6eda96138855b581db6393978b5b4e249e9a73c4a63e7177c425/diff /var/log/pods/hostpath-provisioner_csi-hostpathplugin-n8qh4_aa90eeb0-bd02-434e-a457-47336b084be7/csi-provisioner/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050293 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.671988034s: [/var/lib/containers/storage/overlay/8a728ba153080ddcb9640c92d5b4454f5571d9e19aeacbcef26637a4ad1b4c18/diff /var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/container-replicator/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050331 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.041440743s: [/var/lib/containers/storage/overlay/a8eff9180c5418701e9091220eb32378b98ceae565465d511b6bdc0e9ff08954/diff 
/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7d9d9f99f6-7sstc_b4cf67d3-b13e-4afb-be20-80dc0801c69c/manager/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.050727 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" podUID="1fbc642b-9636-47c2-a3db-7913fa4a6b91" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": dial tcp 10.217.0.117:8081: connect: connection refused" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.052734 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-wxz25" podUID="a99a2194-b89b-4a6a-a086-acd20b489632" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.062639 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.06494657s: [/var/lib/containers/storage/overlay/ad3d8d3fdd00a1fd7ffc576533ad42d55fd6e70b12c3b2ed6726228b2fbfcd3c/diff /var/log/pods/openshift-monitoring_openshift-state-metrics-566fddb674-p7nkr_05eede04-2d90-4979-908e-29a4d88daf36/kube-rbac-proxy-main/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.063310 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.065576576s: [/var/lib/containers/storage/overlay/2821fc1323ab36e72d30a4ae68e745683861fabd2b6440eac03aaaa4164dda4e/diff ]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.063724 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.065790152s: 
[/var/lib/containers/storage/overlay/49d394f5815a51c12b935a0ff70a0217bf1801e289062c78c20943de53327e4a/diff /var/log/pods/openshift-ingress-operator_ingress-operator-5b745b69d9-6df8v_d5bcb23e-d100-4e41-bcc4-e8773a821c91/kube-rbac-proxy/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.066969 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.068685679s: [/var/lib/containers/storage/overlay/0b4c7a0f351750aaffd59027296279ca0c2ab575d1a22f72b2cb8f88c05ee76f/diff ]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.067317 4898 patch_prober.go:28] interesting pod/controller-manager-6d6f47df77-zcqxc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.067383 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" podUID="0fd6bf11-4912-4c55-b89d-5866167a0283" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.069291 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.071427922s: [/var/lib/containers/storage/overlay/9b1cc8d0cef42f704cf04f4da1b0b24f367d083fcd6e8c45bcde0295e707effb/diff /var/log/pods/openshift-ovn-kubernetes_ovnkube-control-plane-749d76644c-brphv_3f71cb6f-609f-4fab-86eb-e01518fb8c61/ovnkube-cluster-manager/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.075288 4898 
patch_prober.go:28] interesting pod/controller-manager-6d6f47df77-zcqxc container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.075343 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6d6f47df77-zcqxc" podUID="0fd6bf11-4912-4c55-b89d-5866167a0283" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.077743 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-6867fd7bcf-bbj7b" podUID="4ed17564-edd0-4a66-8b9b-04aabd280113" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.205:8080/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.078362 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-6867fd7bcf-bbj7b" podUID="4ed17564-edd0-4a66-8b9b-04aabd280113" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.205:8080/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.079278 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-trbn2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.079304 4898 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-trbn2" podUID="bb6a1bca-9d01-4e70-882f-47a6e90923df" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.57:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.082209 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-n8qh4" podUID="aa90eeb0-bd02-434e-a457-47336b084be7" containerName="hostpath-provisioner" probeResult="failure" output="Get \"http://10.217.0.43:9898/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.083122 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-6gtqk" podUID="d898c00f-7c50-483d-84f3-9c502696b39a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.083317 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-6gtqk" podUID="d898c00f-7c50-483d-84f3-9c502696b39a" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.083957 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-6gtqk" podUID="d898c00f-7c50-483d-84f3-9c502696b39a" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.084239 4898 patch_prober.go:28] interesting pod/console-9bc76884c-z28hg container/console 
namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.137:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.084330 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-9bc76884c-z28hg" podUID="4ba352c0-f542-46ac-abcc-c136ddfb67fc" containerName="console" probeResult="failure" output="Get \"https://10.217.0.137:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.089547 4898 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7csms container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.089598 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" podUID="f46b0f35-a460-4977-a99a-8e0ea7015416" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.089752 4898 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7csms container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 
14:31:46.089769 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7csms" podUID="f46b0f35-a460-4977-a99a-8e0ea7015416" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.091527 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-596947c645-4xjkq" podUID="c6d5540b-2eb6-411c-b1a9-b0db78e67ae7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.091824 4898 patch_prober.go:28] interesting pod/logging-loki-query-frontend-84558f7c9f-rjbbq container/loki-query-frontend namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.098529 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" podUID="98efd2cb-8cd8-49c2-a54b-5a04cf51dc71" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.093627 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" podUID="c0569df8-06fa-4d31-a59a-904b90e4a0ca" containerName="frr-k8s-webhook-server" 
probeResult="failure" output="Get \"http://10.217.0.97:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.093479 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-llp2z" podUID="c0569df8-06fa-4d31-a59a-904b90e4a0ca" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.97:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.093310 4898 patch_prober.go:28] interesting pod/logging-loki-query-frontend-84558f7c9f-rjbbq container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.098835 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-rjbbq" podUID="98efd2cb-8cd8-49c2-a54b-5a04cf51dc71" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.093613 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.095934296s: [/var/lib/containers/storage/overlay/408fc88068024451e865589adcf4dbe1b17ea9fadcddf05d26321fb79f2b3bc4/diff /var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-pqc9z_ac70ed50-7e53-4bb9-ac63-35e5c0651db5/manager/0.log]; will not log again for this container unless duration exceeds 3s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.096629 4898 
fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.098940987s: [/var/lib/containers/storage/overlay/78a10a3afa610de416f6d0caf81bff294dfd6327e9cfa928d87ba543d5532636/diff /var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-pgfnv_09d9c781-008c-4486-807c-159f4fefe857/manager/0.log]; will not log again for this container unless duration exceeds 3s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.100938 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-operator-5dc777b99d-mszqj" podUID="4e56a824-5c00-4a67-a8c3-a32a001f0ce4" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.102:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.101661 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.103772055s: [/var/lib/containers/storage/overlay/793048d2d5668e8cbeb842234278b9a9e064652ecf327e8aa2a51a56742c67cd/diff /var/log/pods/openshift-dns_dns-default-cc9qv_2ae587f2-140e-4780-844e-1eb3430f7ee6/dns/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.101866 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.104045473s: [/var/lib/containers/storage/overlay/63fd988c41a018a2c116a725332ce2e28cdb618e2ad7e43eedd280a4ab55cedf/diff /var/log/pods/openshift-ingress-canary_ingress-canary-mdzwq_ccaafcb8-0877-4754-b7f7-43d3c35c6283/serve-healthcheck-canary/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.102058 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.104129535s: [/var/lib/containers/storage/overlay/bf74d382d237cd29ca9a2177c056a3cbfc5be597030f59b12d0f1ca4c9443e04/diff 
/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-pdvzc_d7d8e047-7525-4d88-b802-550590e7f743/manager/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.124629 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="512ddc04-04b7-409c-a856-4f9bf3b22c50" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.203:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.124868 4898 patch_prober.go:28] interesting pod/route-controller-manager-6fcffdd775-jsnhs container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.124888 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" podUID="2dcb95b5-448f-4cc1-8399-9c4c1cc5046f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.125001 4898 patch_prober.go:28] interesting pod/route-controller-manager-6fcffdd775-jsnhs container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.125015 4898 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-6fcffdd775-jsnhs" podUID="2dcb95b5-448f-4cc1-8399-9c4c1cc5046f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.125129 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-api-58ffc484cf-pk2vt" podUID="d83af347-3774-4e08-8138-6e67557da826" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.1.15:8004/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.126104 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-cfnapi-68bbb97c49-zk2lz" podUID="911f5e17-1a51-4bf3-8f1c-cdedc2f4404c" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.1.16:8000/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.126364 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-58ffc484cf-pk2vt" podUID="d83af347-3774-4e08-8138-6e67557da826" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.1.15:8004/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.144223 4898 patch_prober.go:28] interesting pod/logging-loki-distributor-76cc67bf56-qjz7m container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.144254 4898 patch_prober.go:28] 
interesting pod/logging-loki-distributor-76cc67bf56-qjz7m container/loki-distributor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.144260 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" podUID="841a3e5b-876d-43b2-b24a-d5c01876c30d" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.144280 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-distributor-76cc67bf56-qjz7m" podUID="841a3e5b-876d-43b2-b24a-d5c01876c30d" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.144385 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-5bddd4b946-bkr8v" podUID="cfd0fc01-1bde-4b11-bbdd-d95693d0dd15" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.144696 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-5bddd4b946-bkr8v" podUID="cfd0fc01-1bde-4b11-bbdd-d95693d0dd15" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.98:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.145719 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.216:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.149390 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.216:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.150495 4898 patch_prober.go:28] interesting pod/logging-loki-querier-5895d59bb8-62tqd container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.150537 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" podUID="4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.150682 4898 patch_prober.go:28] interesting pod/logging-loki-querier-5895d59bb8-62tqd container/loki-querier namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.53:3101/loki/api/v1/status/buildinfo\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 
crc kubenswrapper[4898]: I1211 14:31:46.150859 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-querier-5895d59bb8-62tqd" podUID="4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/loki/api/v1/status/buildinfo\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.151091 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-5655c58dd6-dwxc2" podUID="80c45250-6b80-452d-ade1-a8b024cabf10" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.46:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.151417 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-5655c58dd6-dwxc2" podUID="80c45250-6b80-452d-ade1-a8b024cabf10" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.46:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.151602 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.151883 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.163294 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.165614766s: [/var/lib/containers/storage/overlay/de576e76b89e082b3427961813ba7086047f49f6de4c55475615d0df1f04d785/diff /var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-wxz25_a99a2194-b89b-4a6a-a086-acd20b489632/manager/0.log]; will not log again for this container unless duration exceeds 3s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.163541 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.165860082s: [/var/lib/containers/storage/overlay/559a891cf07cd5b5b234e2bcb2df5d66489a5685e2f3af13b497e0ffd2b1413d/diff /var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-7kffw_5c391a19-7c2d-4838-9269-2c5cd8eea1ad/manager/0.log]; will not log again for this container unless duration exceeds 3s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.163954 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.166056477s: [/var/lib/containers/storage/overlay/31ae21029212ebcf6233d0cac52f59240176fa1c9dcd9bd2f9e0474a3d4ef34e/diff /var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5fdd9b5758-25l4n_4d25d3be-3b53-4824-b357-5f251a16aa38/kube-scheduler-operator-container/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.164094 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.16616355s: [/var/lib/containers/storage/overlay/14f0934fbd23ae4a76afb5013a28622cf828d355d92055f5dd245a68d5e86e50/diff /var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-tq8mw_0c6054e7-bb0a-4cbd-b459-d9d100182fa1/manager/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.183749 4898 
fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.186011028s: [/var/lib/containers/storage/overlay/841ac8d5eb070e832c48377f6e216da32fb30d216c531c85643ca9b47ff42ee1/diff /var/log/pods/openstack_rabbitmq-server-0_d7d19abc-90d0-413d-b8d7-67ae58b010f7/rabbitmq/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.210525 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.212848905s: [/var/lib/containers/storage/overlay/186f059e84a576e0f2d5fcb01510038463db928335c2d612ed3f6fc64d2fdff2/diff /var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-wlqm4_e87a760e-bf60-4a98-bb37-1f44745e250f/manager/0.log]; will not log again for this container unless duration exceeds 3s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.212060 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.214370476s: [/var/lib/containers/storage/overlay/eec7d82040e51ed4f5606a2b9b561357b43135b47ef0bf0fd3c89e7c4375bcb6/diff /var/log/pods/openstack_neutron-84b4b98fdc-tbjdg_72ca8f16-912b-44f0-bc9d-868f381fb8fb/neutron-api/0.log]; will not log again for this container unless duration exceeds 3s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.220665 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.222820242s: [/var/lib/containers/storage/overlay/b6aefdd56963a05474aada2b290c1af090939f8d9fe60da8afa511d53ba66d94/diff /var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-789f6589d5-zl9tf_d5b90074-739f-4f6d-a41c-29612abca57e/package-server-manager/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.250035 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-69ffd5987-wz9b5 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get 
\"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.250087 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" podUID="6d6f1657-b9dd-4fba-a216-ca660a4fa958" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.254976 4898 patch_prober.go:28] interesting pod/apiserver-76f77b778f-fvkz4 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.255025 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-fvkz4" podUID="494fa5aa-0063-4f27-af65-f3a92879017a" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.307549 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.62:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.307870 4898 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-logging/logging-loki-ingester-0" podUID="f6ad4db8-64f3-403c-9c92-9033a73ed12c" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.62:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.307651 4898 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.75:3101/loki/api/v1/status/buildinfo\": context deadline exceeded" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.307926 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-compactor-0" podUID="3cd3cd1d-9ead-4620-a346-f83e9e5190ba" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.75:3101/loki/api/v1/status/buildinfo\": context deadline exceeded" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.308277 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-69ffd5987-wz9b5 container/gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.55:8081/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.308306 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" podUID="6d6f1657-b9dd-4fba-a216-ca660a4fa958" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.308338 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-69ffd5987-wz9b5 container/opa 
namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.55:8083/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.308350 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" podUID="6d6f1657-b9dd-4fba-a216-ca660a4fa958" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.310597 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-78f8948974-9p4w9" podUID="42b8c71f-abd8-49b1-b604-49b3292ba29a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.310648 4898 patch_prober.go:28] interesting pod/metrics-server-7c75cc77ff-zj4jw container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.76:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.310663 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-7c75cc77ff-zj4jw" podUID="faa81158-1b24-4a0a-8fb6-f362177c51fd" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.76:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:45.955670 4898 patch_prober.go:28] interesting 
pod/loki-operator-controller-manager-7d9d9f99f6-7sstc container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:45.955697 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-69ffd5987-wz9b5 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.310982 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-7d9d9f99f6-7sstc" podUID="b4cf67d3-b13e-4afb-be20-80dc0801c69c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.310996 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" podUID="6d6f1657-b9dd-4fba-a216-ca660a4fa958" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.311465 4898 patch_prober.go:28] interesting pod/console-9bc76884c-z28hg container/console namespace/openshift-console: Liveness probe status=failure output="Get \"https://10.217.0.137:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.311488 4898 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-console/console-9bc76884c-z28hg" podUID="4ba352c0-f542-46ac-abcc-c136ddfb67fc" containerName="console" probeResult="failure" output="Get \"https://10.217.0.137:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.311519 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-69ffd5987-jj95c container/gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.61:8081/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.311533 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" podUID="c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.61:8081/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.321747 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/console-9bc76884c-z28hg" Dec 11 14:31:46 crc kubenswrapper[4898]: I1211 14:31:46.335168 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console" containerStatusID={"Type":"cri-o","ID":"2b862afefdde1f53049b99ed9a00156a984800927d438a9fa0edd9dabb7aebc0"} pod="openshift-console/console-9bc76884c-z28hg" containerMessage="Container console failed liveness probe, will be restarted" Dec 11 14:31:47 crc kubenswrapper[4898]: I1211 14:31:47.974126 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" podUID="1fbc642b-9636-47c2-a3db-7913fa4a6b91" containerName="manager" 
probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": dial tcp 10.217.0.117:8081: connect: connection refused" Dec 11 14:31:47 crc kubenswrapper[4898]: I1211 14:31:47.974140 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" podUID="1fbc642b-9636-47c2-a3db-7913fa4a6b91" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/healthz\": dial tcp 10.217.0.117:8081: connect: connection refused" Dec 11 14:31:49 crc kubenswrapper[4898]: I1211 14:31:49.242134 4898 generic.go:334] "Generic (PLEG): container finished" podID="1fbc642b-9636-47c2-a3db-7913fa4a6b91" containerID="4349b750ee8f0ff9ebcade040362a3fdaaed85dd862c4059adc6e3c26f1d1eef" exitCode=1 Dec 11 14:31:49 crc kubenswrapper[4898]: I1211 14:31:49.242340 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" event={"ID":"1fbc642b-9636-47c2-a3db-7913fa4a6b91","Type":"ContainerDied","Data":"4349b750ee8f0ff9ebcade040362a3fdaaed85dd862c4059adc6e3c26f1d1eef"} Dec 11 14:31:49 crc kubenswrapper[4898]: I1211 14:31:49.243851 4898 scope.go:117] "RemoveContainer" containerID="4349b750ee8f0ff9ebcade040362a3fdaaed85dd862c4059adc6e3c26f1d1eef" Dec 11 14:31:51 crc kubenswrapper[4898]: I1211 14:31:51.283268 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" event={"ID":"1fbc642b-9636-47c2-a3db-7913fa4a6b91","Type":"ContainerStarted","Data":"d371a4e97044b968d16d4f6454821cabc5d064407bdd2a9ea5d04130b39a515c"} Dec 11 14:31:52 crc kubenswrapper[4898]: I1211 14:31:52.292098 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 14:31:57 crc kubenswrapper[4898]: I1211 14:31:57.978159 4898 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fv8ph7" Dec 11 14:32:04 crc kubenswrapper[4898]: I1211 14:32:04.996140 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:32:04 crc kubenswrapper[4898]: I1211 14:32:04.996693 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:32:04 crc kubenswrapper[4898]: I1211 14:32:04.996750 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 14:32:04 crc kubenswrapper[4898]: I1211 14:32:04.997474 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 14:32:04 crc kubenswrapper[4898]: I1211 14:32:04.997540 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" gracePeriod=600 Dec 11 14:32:05 crc kubenswrapper[4898]: E1211 14:32:05.126921 4898 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:32:05 crc kubenswrapper[4898]: I1211 14:32:05.440736 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" exitCode=0 Dec 11 14:32:05 crc kubenswrapper[4898]: I1211 14:32:05.441050 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c"} Dec 11 14:32:05 crc kubenswrapper[4898]: I1211 14:32:05.441226 4898 scope.go:117] "RemoveContainer" containerID="da62fe95b38866dac78489cf9559a274d6c76f5f176513a8e912deec87459d47" Dec 11 14:32:05 crc kubenswrapper[4898]: I1211 14:32:05.442097 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:32:05 crc kubenswrapper[4898]: E1211 14:32:05.442514 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:32:11 crc kubenswrapper[4898]: I1211 14:32:11.453262 4898 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-console/console-9bc76884c-z28hg" podUID="4ba352c0-f542-46ac-abcc-c136ddfb67fc" containerName="console" containerID="cri-o://2b862afefdde1f53049b99ed9a00156a984800927d438a9fa0edd9dabb7aebc0" gracePeriod=15 Dec 11 14:32:12 crc kubenswrapper[4898]: I1211 14:32:12.515246 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9bc76884c-z28hg_4ba352c0-f542-46ac-abcc-c136ddfb67fc/console/0.log" Dec 11 14:32:12 crc kubenswrapper[4898]: I1211 14:32:12.516406 4898 generic.go:334] "Generic (PLEG): container finished" podID="4ba352c0-f542-46ac-abcc-c136ddfb67fc" containerID="2b862afefdde1f53049b99ed9a00156a984800927d438a9fa0edd9dabb7aebc0" exitCode=2 Dec 11 14:32:12 crc kubenswrapper[4898]: I1211 14:32:12.516494 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9bc76884c-z28hg" event={"ID":"4ba352c0-f542-46ac-abcc-c136ddfb67fc","Type":"ContainerDied","Data":"2b862afefdde1f53049b99ed9a00156a984800927d438a9fa0edd9dabb7aebc0"} Dec 11 14:32:12 crc kubenswrapper[4898]: I1211 14:32:12.516567 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9bc76884c-z28hg" event={"ID":"4ba352c0-f542-46ac-abcc-c136ddfb67fc","Type":"ContainerStarted","Data":"d72c075dc7871e61fb69011a1d028ef903c49b75773f149de16f3104f7ba638a"} Dec 11 14:32:14 crc kubenswrapper[4898]: I1211 14:32:14.297127 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-9bc76884c-z28hg" Dec 11 14:32:14 crc kubenswrapper[4898]: I1211 14:32:14.297568 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9bc76884c-z28hg" Dec 11 14:32:14 crc kubenswrapper[4898]: I1211 14:32:14.303206 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9bc76884c-z28hg" Dec 11 14:32:14 crc kubenswrapper[4898]: I1211 14:32:14.543053 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console/console-9bc76884c-z28hg" Dec 11 14:32:17 crc kubenswrapper[4898]: I1211 14:32:17.775432 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:32:17 crc kubenswrapper[4898]: E1211 14:32:17.776304 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:32:28 crc kubenswrapper[4898]: I1211 14:32:28.775919 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:32:28 crc kubenswrapper[4898]: E1211 14:32:28.776916 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:32:42 crc kubenswrapper[4898]: I1211 14:32:42.796744 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:32:42 crc kubenswrapper[4898]: E1211 14:32:42.797913 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:32:54 crc kubenswrapper[4898]: I1211 14:32:54.776321 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:32:54 crc kubenswrapper[4898]: E1211 14:32:54.777754 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:33:05 crc kubenswrapper[4898]: I1211 14:33:05.776601 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:33:05 crc kubenswrapper[4898]: E1211 14:33:05.778398 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:33:20 crc kubenswrapper[4898]: I1211 14:33:20.776255 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:33:20 crc kubenswrapper[4898]: E1211 14:33:20.777266 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:33:35 crc kubenswrapper[4898]: I1211 14:33:35.775186 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:33:35 crc kubenswrapper[4898]: E1211 14:33:35.776112 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:33:47 crc kubenswrapper[4898]: I1211 14:33:47.776140 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:33:47 crc kubenswrapper[4898]: E1211 14:33:47.777451 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:33:59 crc kubenswrapper[4898]: I1211 14:33:59.776127 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:33:59 crc kubenswrapper[4898]: E1211 14:33:59.777865 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:34:10 crc kubenswrapper[4898]: I1211 14:34:10.775402 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:34:10 crc kubenswrapper[4898]: E1211 14:34:10.776169 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:34:23 crc kubenswrapper[4898]: I1211 14:34:23.775161 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:34:23 crc kubenswrapper[4898]: E1211 14:34:23.776101 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:34:35 crc kubenswrapper[4898]: I1211 14:34:35.776255 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:34:35 crc kubenswrapper[4898]: E1211 14:34:35.777932 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:34:49 crc kubenswrapper[4898]: I1211 14:34:49.775395 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:34:49 crc kubenswrapper[4898]: E1211 14:34:49.776636 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:35:02 crc kubenswrapper[4898]: I1211 14:35:02.786554 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:35:02 crc kubenswrapper[4898]: E1211 14:35:02.787404 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:35:13 crc kubenswrapper[4898]: I1211 14:35:13.776550 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:35:13 crc kubenswrapper[4898]: E1211 14:35:13.778747 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:35:22 crc kubenswrapper[4898]: I1211 14:35:22.070134 4898 generic.go:334] "Generic (PLEG): container finished" podID="a1c00711-2048-486e-b3f7-3d5441032df8" containerID="36b9059582a127334c7da49faa3c0ab449f5011fcc0db21ade710e2d72fddbb7" exitCode=1 Dec 11 14:35:22 crc kubenswrapper[4898]: I1211 14:35:22.070227 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a1c00711-2048-486e-b3f7-3d5441032df8","Type":"ContainerDied","Data":"36b9059582a127334c7da49faa3c0ab449f5011fcc0db21ade710e2d72fddbb7"} Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.485218 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.575914 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a1c00711-2048-486e-b3f7-3d5441032df8-test-operator-ephemeral-workdir\") pod \"a1c00711-2048-486e-b3f7-3d5441032df8\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.576101 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"a1c00711-2048-486e-b3f7-3d5441032df8\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.576250 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6xgj\" (UniqueName: 
\"kubernetes.io/projected/a1c00711-2048-486e-b3f7-3d5441032df8-kube-api-access-t6xgj\") pod \"a1c00711-2048-486e-b3f7-3d5441032df8\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.576333 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-ssh-key\") pod \"a1c00711-2048-486e-b3f7-3d5441032df8\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.576395 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-ca-certs\") pod \"a1c00711-2048-486e-b3f7-3d5441032df8\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.576425 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-openstack-config-secret\") pod \"a1c00711-2048-486e-b3f7-3d5441032df8\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.576493 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a1c00711-2048-486e-b3f7-3d5441032df8-test-operator-ephemeral-temporary\") pod \"a1c00711-2048-486e-b3f7-3d5441032df8\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.576516 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a1c00711-2048-486e-b3f7-3d5441032df8-openstack-config\") pod \"a1c00711-2048-486e-b3f7-3d5441032df8\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " Dec 11 14:35:23 crc 
kubenswrapper[4898]: I1211 14:35:23.576551 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1c00711-2048-486e-b3f7-3d5441032df8-config-data\") pod \"a1c00711-2048-486e-b3f7-3d5441032df8\" (UID: \"a1c00711-2048-486e-b3f7-3d5441032df8\") " Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.577383 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1c00711-2048-486e-b3f7-3d5441032df8-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "a1c00711-2048-486e-b3f7-3d5441032df8" (UID: "a1c00711-2048-486e-b3f7-3d5441032df8"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.577864 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1c00711-2048-486e-b3f7-3d5441032df8-config-data" (OuterVolumeSpecName: "config-data") pod "a1c00711-2048-486e-b3f7-3d5441032df8" (UID: "a1c00711-2048-486e-b3f7-3d5441032df8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.579753 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1c00711-2048-486e-b3f7-3d5441032df8-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "a1c00711-2048-486e-b3f7-3d5441032df8" (UID: "a1c00711-2048-486e-b3f7-3d5441032df8"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.586235 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1c00711-2048-486e-b3f7-3d5441032df8-kube-api-access-t6xgj" (OuterVolumeSpecName: "kube-api-access-t6xgj") pod "a1c00711-2048-486e-b3f7-3d5441032df8" (UID: "a1c00711-2048-486e-b3f7-3d5441032df8"). InnerVolumeSpecName "kube-api-access-t6xgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.593503 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "a1c00711-2048-486e-b3f7-3d5441032df8" (UID: "a1c00711-2048-486e-b3f7-3d5441032df8"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.623220 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a1c00711-2048-486e-b3f7-3d5441032df8" (UID: "a1c00711-2048-486e-b3f7-3d5441032df8"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.623747 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a1c00711-2048-486e-b3f7-3d5441032df8" (UID: "a1c00711-2048-486e-b3f7-3d5441032df8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.633613 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "a1c00711-2048-486e-b3f7-3d5441032df8" (UID: "a1c00711-2048-486e-b3f7-3d5441032df8"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.659753 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1c00711-2048-486e-b3f7-3d5441032df8-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a1c00711-2048-486e-b3f7-3d5441032df8" (UID: "a1c00711-2048-486e-b3f7-3d5441032df8"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.683600 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.683648 4898 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.683665 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a1c00711-2048-486e-b3f7-3d5441032df8-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.683681 4898 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a1c00711-2048-486e-b3f7-3d5441032df8-test-operator-ephemeral-temporary\") on node \"crc\" 
DevicePath \"\"" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.689272 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a1c00711-2048-486e-b3f7-3d5441032df8-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.689303 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1c00711-2048-486e-b3f7-3d5441032df8-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.689318 4898 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a1c00711-2048-486e-b3f7-3d5441032df8-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.691189 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.691219 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6xgj\" (UniqueName: \"kubernetes.io/projected/a1c00711-2048-486e-b3f7-3d5441032df8-kube-api-access-t6xgj\") on node \"crc\" DevicePath \"\"" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.732235 4898 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 11 14:35:23 crc kubenswrapper[4898]: I1211 14:35:23.793642 4898 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 11 14:35:24 crc kubenswrapper[4898]: I1211 14:35:24.093041 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"a1c00711-2048-486e-b3f7-3d5441032df8","Type":"ContainerDied","Data":"9350096e9801513c7502031d4fa8739d2c915e503f1c7a4c8fd07d48bf747dfe"} Dec 11 14:35:24 crc kubenswrapper[4898]: I1211 14:35:24.093093 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9350096e9801513c7502031d4fa8739d2c915e503f1c7a4c8fd07d48bf747dfe" Dec 11 14:35:24 crc kubenswrapper[4898]: I1211 14:35:24.093219 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.661348 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 11 14:35:27 crc kubenswrapper[4898]: E1211 14:35:27.663288 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c" containerName="extract-content" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.663329 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c" containerName="extract-content" Dec 11 14:35:27 crc kubenswrapper[4898]: E1211 14:35:27.663369 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c" containerName="registry-server" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.663383 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c" containerName="registry-server" Dec 11 14:35:27 crc kubenswrapper[4898]: E1211 14:35:27.663428 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c00711-2048-486e-b3f7-3d5441032df8" containerName="tempest-tests-tempest-tests-runner" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.663442 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c00711-2048-486e-b3f7-3d5441032df8" 
containerName="tempest-tests-tempest-tests-runner" Dec 11 14:35:27 crc kubenswrapper[4898]: E1211 14:35:27.663505 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c" containerName="extract-utilities" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.663547 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c" containerName="extract-utilities" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.664088 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ffa165-123d-4cfd-8cb1-bcf7ac05d69c" containerName="registry-server" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.664125 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1c00711-2048-486e-b3f7-3d5441032df8" containerName="tempest-tests-tempest-tests-runner" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.666082 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.668635 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-wn9nm" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.673389 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.702348 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5a0f55fc-21c0-4455-9732-de26acfd5907\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.702745 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvt2l\" (UniqueName: \"kubernetes.io/projected/5a0f55fc-21c0-4455-9732-de26acfd5907-kube-api-access-cvt2l\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5a0f55fc-21c0-4455-9732-de26acfd5907\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.775418 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:35:27 crc kubenswrapper[4898]: E1211 14:35:27.775878 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.805073 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5a0f55fc-21c0-4455-9732-de26acfd5907\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.805174 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvt2l\" (UniqueName: \"kubernetes.io/projected/5a0f55fc-21c0-4455-9732-de26acfd5907-kube-api-access-cvt2l\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5a0f55fc-21c0-4455-9732-de26acfd5907\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.806652 4898 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5a0f55fc-21c0-4455-9732-de26acfd5907\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.845691 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvt2l\" (UniqueName: \"kubernetes.io/projected/5a0f55fc-21c0-4455-9732-de26acfd5907-kube-api-access-cvt2l\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5a0f55fc-21c0-4455-9732-de26acfd5907\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.855207 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5a0f55fc-21c0-4455-9732-de26acfd5907\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 14:35:27 crc kubenswrapper[4898]: I1211 14:35:27.999656 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 14:35:28 crc kubenswrapper[4898]: I1211 14:35:28.476638 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 11 14:35:29 crc kubenswrapper[4898]: I1211 14:35:29.091815 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 14:35:29 crc kubenswrapper[4898]: I1211 14:35:29.168995 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"5a0f55fc-21c0-4455-9732-de26acfd5907","Type":"ContainerStarted","Data":"fe0080f8c4c263530b10b999e6af1e8f3ca1ef1c38d0adda2ba5ce52735381fb"} Dec 11 14:35:31 crc kubenswrapper[4898]: I1211 14:35:31.196305 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"5a0f55fc-21c0-4455-9732-de26acfd5907","Type":"ContainerStarted","Data":"6f5ff907f45328a1f1a0b34ad50a522bd0a97792b717e1391ca3fa4920ba53e1"} Dec 11 14:35:31 crc kubenswrapper[4898]: I1211 14:35:31.211796 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=3.117325661 podStartE2EDuration="4.211762294s" podCreationTimestamp="2025-12-11 14:35:27 +0000 UTC" firstStartedPulling="2025-12-11 14:35:29.089805944 +0000 UTC m=+5486.662132381" lastFinishedPulling="2025-12-11 14:35:30.184242557 +0000 UTC m=+5487.756569014" observedRunningTime="2025-12-11 14:35:31.208341383 +0000 UTC m=+5488.780667820" watchObservedRunningTime="2025-12-11 14:35:31.211762294 +0000 UTC m=+5488.784088811" Dec 11 14:35:42 crc kubenswrapper[4898]: I1211 14:35:42.799421 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:35:42 crc kubenswrapper[4898]: E1211 
14:35:42.801008 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:35:53 crc kubenswrapper[4898]: I1211 14:35:53.775204 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:35:53 crc kubenswrapper[4898]: E1211 14:35:53.776101 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:36:01 crc kubenswrapper[4898]: I1211 14:36:01.472182 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h99mt/must-gather-rk4kt"] Dec 11 14:36:01 crc kubenswrapper[4898]: I1211 14:36:01.474587 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h99mt/must-gather-rk4kt" Dec 11 14:36:01 crc kubenswrapper[4898]: I1211 14:36:01.481810 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-h99mt"/"default-dockercfg-htg4j" Dec 11 14:36:01 crc kubenswrapper[4898]: I1211 14:36:01.482000 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-h99mt"/"openshift-service-ca.crt" Dec 11 14:36:01 crc kubenswrapper[4898]: I1211 14:36:01.482164 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-h99mt"/"kube-root-ca.crt" Dec 11 14:36:01 crc kubenswrapper[4898]: I1211 14:36:01.488957 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h99mt/must-gather-rk4kt"] Dec 11 14:36:01 crc kubenswrapper[4898]: I1211 14:36:01.618345 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e6cf072-5ba2-483d-9105-7772c4a02929-must-gather-output\") pod \"must-gather-rk4kt\" (UID: \"7e6cf072-5ba2-483d-9105-7772c4a02929\") " pod="openshift-must-gather-h99mt/must-gather-rk4kt" Dec 11 14:36:01 crc kubenswrapper[4898]: I1211 14:36:01.618403 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsm8l\" (UniqueName: \"kubernetes.io/projected/7e6cf072-5ba2-483d-9105-7772c4a02929-kube-api-access-rsm8l\") pod \"must-gather-rk4kt\" (UID: \"7e6cf072-5ba2-483d-9105-7772c4a02929\") " pod="openshift-must-gather-h99mt/must-gather-rk4kt" Dec 11 14:36:01 crc kubenswrapper[4898]: I1211 14:36:01.720783 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e6cf072-5ba2-483d-9105-7772c4a02929-must-gather-output\") pod \"must-gather-rk4kt\" (UID: \"7e6cf072-5ba2-483d-9105-7772c4a02929\") " 
pod="openshift-must-gather-h99mt/must-gather-rk4kt" Dec 11 14:36:01 crc kubenswrapper[4898]: I1211 14:36:01.721083 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsm8l\" (UniqueName: \"kubernetes.io/projected/7e6cf072-5ba2-483d-9105-7772c4a02929-kube-api-access-rsm8l\") pod \"must-gather-rk4kt\" (UID: \"7e6cf072-5ba2-483d-9105-7772c4a02929\") " pod="openshift-must-gather-h99mt/must-gather-rk4kt" Dec 11 14:36:01 crc kubenswrapper[4898]: I1211 14:36:01.722284 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e6cf072-5ba2-483d-9105-7772c4a02929-must-gather-output\") pod \"must-gather-rk4kt\" (UID: \"7e6cf072-5ba2-483d-9105-7772c4a02929\") " pod="openshift-must-gather-h99mt/must-gather-rk4kt" Dec 11 14:36:01 crc kubenswrapper[4898]: I1211 14:36:01.742515 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsm8l\" (UniqueName: \"kubernetes.io/projected/7e6cf072-5ba2-483d-9105-7772c4a02929-kube-api-access-rsm8l\") pod \"must-gather-rk4kt\" (UID: \"7e6cf072-5ba2-483d-9105-7772c4a02929\") " pod="openshift-must-gather-h99mt/must-gather-rk4kt" Dec 11 14:36:01 crc kubenswrapper[4898]: I1211 14:36:01.795407 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h99mt/must-gather-rk4kt" Dec 11 14:36:02 crc kubenswrapper[4898]: I1211 14:36:02.309711 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h99mt/must-gather-rk4kt"] Dec 11 14:36:02 crc kubenswrapper[4898]: I1211 14:36:02.902293 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h99mt/must-gather-rk4kt" event={"ID":"7e6cf072-5ba2-483d-9105-7772c4a02929","Type":"ContainerStarted","Data":"c9e38cc9c97d7518e2c8a70f8548a4d1a169f16815ef61fa3eb837ed6c52c7dd"} Dec 11 14:36:04 crc kubenswrapper[4898]: I1211 14:36:04.777269 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:36:04 crc kubenswrapper[4898]: E1211 14:36:04.778127 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:36:13 crc kubenswrapper[4898]: I1211 14:36:13.142423 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h99mt/must-gather-rk4kt" event={"ID":"7e6cf072-5ba2-483d-9105-7772c4a02929","Type":"ContainerStarted","Data":"22ccb44ac9ef936558611c69137cfd71c856b941bd7ee3e4a58f4133bebf3e83"} Dec 11 14:36:13 crc kubenswrapper[4898]: I1211 14:36:13.142931 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h99mt/must-gather-rk4kt" event={"ID":"7e6cf072-5ba2-483d-9105-7772c4a02929","Type":"ContainerStarted","Data":"05180daba4b3c82716c0c9ef849e4d95be4ec54c4f3edd315cbe96d6b595e646"} Dec 11 14:36:13 crc kubenswrapper[4898]: I1211 14:36:13.164659 4898 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-h99mt/must-gather-rk4kt" podStartSLOduration=2.587741188 podStartE2EDuration="12.16463748s" podCreationTimestamp="2025-12-11 14:36:01 +0000 UTC" firstStartedPulling="2025-12-11 14:36:02.3128721 +0000 UTC m=+5519.885198547" lastFinishedPulling="2025-12-11 14:36:11.889768402 +0000 UTC m=+5529.462094839" observedRunningTime="2025-12-11 14:36:13.155165767 +0000 UTC m=+5530.727492204" watchObservedRunningTime="2025-12-11 14:36:13.16463748 +0000 UTC m=+5530.736963917" Dec 11 14:36:17 crc kubenswrapper[4898]: I1211 14:36:17.114381 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h99mt/crc-debug-2njt5"] Dec 11 14:36:17 crc kubenswrapper[4898]: I1211 14:36:17.116702 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h99mt/crc-debug-2njt5" Dec 11 14:36:17 crc kubenswrapper[4898]: I1211 14:36:17.217235 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7wf7\" (UniqueName: \"kubernetes.io/projected/2a1d09c2-7fa9-481d-a192-d397a787931f-kube-api-access-t7wf7\") pod \"crc-debug-2njt5\" (UID: \"2a1d09c2-7fa9-481d-a192-d397a787931f\") " pod="openshift-must-gather-h99mt/crc-debug-2njt5" Dec 11 14:36:17 crc kubenswrapper[4898]: I1211 14:36:17.217336 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a1d09c2-7fa9-481d-a192-d397a787931f-host\") pod \"crc-debug-2njt5\" (UID: \"2a1d09c2-7fa9-481d-a192-d397a787931f\") " pod="openshift-must-gather-h99mt/crc-debug-2njt5" Dec 11 14:36:17 crc kubenswrapper[4898]: I1211 14:36:17.319914 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7wf7\" (UniqueName: \"kubernetes.io/projected/2a1d09c2-7fa9-481d-a192-d397a787931f-kube-api-access-t7wf7\") pod \"crc-debug-2njt5\" (UID: 
\"2a1d09c2-7fa9-481d-a192-d397a787931f\") " pod="openshift-must-gather-h99mt/crc-debug-2njt5" Dec 11 14:36:17 crc kubenswrapper[4898]: I1211 14:36:17.320002 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a1d09c2-7fa9-481d-a192-d397a787931f-host\") pod \"crc-debug-2njt5\" (UID: \"2a1d09c2-7fa9-481d-a192-d397a787931f\") " pod="openshift-must-gather-h99mt/crc-debug-2njt5" Dec 11 14:36:17 crc kubenswrapper[4898]: I1211 14:36:17.320402 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a1d09c2-7fa9-481d-a192-d397a787931f-host\") pod \"crc-debug-2njt5\" (UID: \"2a1d09c2-7fa9-481d-a192-d397a787931f\") " pod="openshift-must-gather-h99mt/crc-debug-2njt5" Dec 11 14:36:17 crc kubenswrapper[4898]: I1211 14:36:17.344770 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7wf7\" (UniqueName: \"kubernetes.io/projected/2a1d09c2-7fa9-481d-a192-d397a787931f-kube-api-access-t7wf7\") pod \"crc-debug-2njt5\" (UID: \"2a1d09c2-7fa9-481d-a192-d397a787931f\") " pod="openshift-must-gather-h99mt/crc-debug-2njt5" Dec 11 14:36:17 crc kubenswrapper[4898]: I1211 14:36:17.438055 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h99mt/crc-debug-2njt5" Dec 11 14:36:17 crc kubenswrapper[4898]: I1211 14:36:17.775058 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:36:17 crc kubenswrapper[4898]: E1211 14:36:17.775676 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:36:18 crc kubenswrapper[4898]: I1211 14:36:18.203594 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h99mt/crc-debug-2njt5" event={"ID":"2a1d09c2-7fa9-481d-a192-d397a787931f","Type":"ContainerStarted","Data":"124c3fd460c9a82ed3dba8fce6547ee6a7bcf669cfc30ee72d3056113672d7a6"} Dec 11 14:36:30 crc kubenswrapper[4898]: I1211 14:36:30.775975 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:36:30 crc kubenswrapper[4898]: E1211 14:36:30.776724 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:36:31 crc kubenswrapper[4898]: I1211 14:36:31.398448 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h99mt/crc-debug-2njt5" 
event={"ID":"2a1d09c2-7fa9-481d-a192-d397a787931f","Type":"ContainerStarted","Data":"fb7e8ce7f20277e58e8f3a910532bb144484588fb091f65e8d89b2fa4152c9d0"} Dec 11 14:36:31 crc kubenswrapper[4898]: I1211 14:36:31.421520 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h99mt/crc-debug-2njt5" podStartSLOduration=1.7431112930000001 podStartE2EDuration="14.421500752s" podCreationTimestamp="2025-12-11 14:36:17 +0000 UTC" firstStartedPulling="2025-12-11 14:36:17.485502345 +0000 UTC m=+5535.057828782" lastFinishedPulling="2025-12-11 14:36:30.163891804 +0000 UTC m=+5547.736218241" observedRunningTime="2025-12-11 14:36:31.417022472 +0000 UTC m=+5548.989348919" watchObservedRunningTime="2025-12-11 14:36:31.421500752 +0000 UTC m=+5548.993827199" Dec 11 14:36:42 crc kubenswrapper[4898]: I1211 14:36:42.784125 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:36:42 crc kubenswrapper[4898]: E1211 14:36:42.785814 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:36:57 crc kubenswrapper[4898]: I1211 14:36:57.775329 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:36:57 crc kubenswrapper[4898]: E1211 14:36:57.775997 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:37:12 crc kubenswrapper[4898]: I1211 14:37:12.793838 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:37:14 crc kubenswrapper[4898]: I1211 14:37:14.441677 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"8c3848400d3383130ae2106d38db8971207b13a878503083f06812e151d95d89"} Dec 11 14:37:19 crc kubenswrapper[4898]: I1211 14:37:19.504633 4898 generic.go:334] "Generic (PLEG): container finished" podID="2a1d09c2-7fa9-481d-a192-d397a787931f" containerID="fb7e8ce7f20277e58e8f3a910532bb144484588fb091f65e8d89b2fa4152c9d0" exitCode=0 Dec 11 14:37:19 crc kubenswrapper[4898]: I1211 14:37:19.504743 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h99mt/crc-debug-2njt5" event={"ID":"2a1d09c2-7fa9-481d-a192-d397a787931f","Type":"ContainerDied","Data":"fb7e8ce7f20277e58e8f3a910532bb144484588fb091f65e8d89b2fa4152c9d0"} Dec 11 14:37:20 crc kubenswrapper[4898]: I1211 14:37:20.636996 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h99mt/crc-debug-2njt5" Dec 11 14:37:20 crc kubenswrapper[4898]: I1211 14:37:20.669515 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7wf7\" (UniqueName: \"kubernetes.io/projected/2a1d09c2-7fa9-481d-a192-d397a787931f-kube-api-access-t7wf7\") pod \"2a1d09c2-7fa9-481d-a192-d397a787931f\" (UID: \"2a1d09c2-7fa9-481d-a192-d397a787931f\") " Dec 11 14:37:20 crc kubenswrapper[4898]: I1211 14:37:20.669796 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a1d09c2-7fa9-481d-a192-d397a787931f-host\") pod \"2a1d09c2-7fa9-481d-a192-d397a787931f\" (UID: \"2a1d09c2-7fa9-481d-a192-d397a787931f\") " Dec 11 14:37:20 crc kubenswrapper[4898]: I1211 14:37:20.669947 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a1d09c2-7fa9-481d-a192-d397a787931f-host" (OuterVolumeSpecName: "host") pod "2a1d09c2-7fa9-481d-a192-d397a787931f" (UID: "2a1d09c2-7fa9-481d-a192-d397a787931f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 14:37:20 crc kubenswrapper[4898]: I1211 14:37:20.670560 4898 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a1d09c2-7fa9-481d-a192-d397a787931f-host\") on node \"crc\" DevicePath \"\"" Dec 11 14:37:20 crc kubenswrapper[4898]: I1211 14:37:20.676688 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1d09c2-7fa9-481d-a192-d397a787931f-kube-api-access-t7wf7" (OuterVolumeSpecName: "kube-api-access-t7wf7") pod "2a1d09c2-7fa9-481d-a192-d397a787931f" (UID: "2a1d09c2-7fa9-481d-a192-d397a787931f"). InnerVolumeSpecName "kube-api-access-t7wf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:37:20 crc kubenswrapper[4898]: I1211 14:37:20.691865 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-h99mt/crc-debug-2njt5"] Dec 11 14:37:20 crc kubenswrapper[4898]: I1211 14:37:20.702664 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-h99mt/crc-debug-2njt5"] Dec 11 14:37:20 crc kubenswrapper[4898]: I1211 14:37:20.772196 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7wf7\" (UniqueName: \"kubernetes.io/projected/2a1d09c2-7fa9-481d-a192-d397a787931f-kube-api-access-t7wf7\") on node \"crc\" DevicePath \"\"" Dec 11 14:37:20 crc kubenswrapper[4898]: I1211 14:37:20.790497 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a1d09c2-7fa9-481d-a192-d397a787931f" path="/var/lib/kubelet/pods/2a1d09c2-7fa9-481d-a192-d397a787931f/volumes" Dec 11 14:37:21 crc kubenswrapper[4898]: I1211 14:37:21.525793 4898 scope.go:117] "RemoveContainer" containerID="fb7e8ce7f20277e58e8f3a910532bb144484588fb091f65e8d89b2fa4152c9d0" Dec 11 14:37:21 crc kubenswrapper[4898]: I1211 14:37:21.525802 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h99mt/crc-debug-2njt5" Dec 11 14:37:21 crc kubenswrapper[4898]: I1211 14:37:21.941183 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h99mt/crc-debug-w28vn"] Dec 11 14:37:21 crc kubenswrapper[4898]: E1211 14:37:21.941707 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1d09c2-7fa9-481d-a192-d397a787931f" containerName="container-00" Dec 11 14:37:21 crc kubenswrapper[4898]: I1211 14:37:21.941719 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1d09c2-7fa9-481d-a192-d397a787931f" containerName="container-00" Dec 11 14:37:21 crc kubenswrapper[4898]: I1211 14:37:21.941940 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1d09c2-7fa9-481d-a192-d397a787931f" containerName="container-00" Dec 11 14:37:21 crc kubenswrapper[4898]: I1211 14:37:21.946657 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h99mt/crc-debug-w28vn" Dec 11 14:37:22 crc kubenswrapper[4898]: I1211 14:37:22.002966 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvgf4\" (UniqueName: \"kubernetes.io/projected/17d10e14-67d6-483c-8c7d-779864dfc0c8-kube-api-access-pvgf4\") pod \"crc-debug-w28vn\" (UID: \"17d10e14-67d6-483c-8c7d-779864dfc0c8\") " pod="openshift-must-gather-h99mt/crc-debug-w28vn" Dec 11 14:37:22 crc kubenswrapper[4898]: I1211 14:37:22.003023 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17d10e14-67d6-483c-8c7d-779864dfc0c8-host\") pod \"crc-debug-w28vn\" (UID: \"17d10e14-67d6-483c-8c7d-779864dfc0c8\") " pod="openshift-must-gather-h99mt/crc-debug-w28vn" Dec 11 14:37:22 crc kubenswrapper[4898]: I1211 14:37:22.105898 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvgf4\" (UniqueName: 
\"kubernetes.io/projected/17d10e14-67d6-483c-8c7d-779864dfc0c8-kube-api-access-pvgf4\") pod \"crc-debug-w28vn\" (UID: \"17d10e14-67d6-483c-8c7d-779864dfc0c8\") " pod="openshift-must-gather-h99mt/crc-debug-w28vn" Dec 11 14:37:22 crc kubenswrapper[4898]: I1211 14:37:22.106179 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17d10e14-67d6-483c-8c7d-779864dfc0c8-host\") pod \"crc-debug-w28vn\" (UID: \"17d10e14-67d6-483c-8c7d-779864dfc0c8\") " pod="openshift-must-gather-h99mt/crc-debug-w28vn" Dec 11 14:37:22 crc kubenswrapper[4898]: I1211 14:37:22.106271 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17d10e14-67d6-483c-8c7d-779864dfc0c8-host\") pod \"crc-debug-w28vn\" (UID: \"17d10e14-67d6-483c-8c7d-779864dfc0c8\") " pod="openshift-must-gather-h99mt/crc-debug-w28vn" Dec 11 14:37:22 crc kubenswrapper[4898]: I1211 14:37:22.682869 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvgf4\" (UniqueName: \"kubernetes.io/projected/17d10e14-67d6-483c-8c7d-779864dfc0c8-kube-api-access-pvgf4\") pod \"crc-debug-w28vn\" (UID: \"17d10e14-67d6-483c-8c7d-779864dfc0c8\") " pod="openshift-must-gather-h99mt/crc-debug-w28vn" Dec 11 14:37:22 crc kubenswrapper[4898]: I1211 14:37:22.865914 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h99mt/crc-debug-w28vn" Dec 11 14:37:23 crc kubenswrapper[4898]: I1211 14:37:23.557974 4898 generic.go:334] "Generic (PLEG): container finished" podID="17d10e14-67d6-483c-8c7d-779864dfc0c8" containerID="f1997493364ba0d976d505747a6f7fcfeded7d592c0df5e2ed01515febf73f92" exitCode=0 Dec 11 14:37:23 crc kubenswrapper[4898]: I1211 14:37:23.558093 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h99mt/crc-debug-w28vn" event={"ID":"17d10e14-67d6-483c-8c7d-779864dfc0c8","Type":"ContainerDied","Data":"f1997493364ba0d976d505747a6f7fcfeded7d592c0df5e2ed01515febf73f92"} Dec 11 14:37:23 crc kubenswrapper[4898]: I1211 14:37:23.558310 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h99mt/crc-debug-w28vn" event={"ID":"17d10e14-67d6-483c-8c7d-779864dfc0c8","Type":"ContainerStarted","Data":"81d2860d474ec81dcd32f7fc9645c94c77409acd15fd6049b3f312b31a60bd51"} Dec 11 14:37:24 crc kubenswrapper[4898]: I1211 14:37:24.607299 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-h99mt/crc-debug-w28vn"] Dec 11 14:37:24 crc kubenswrapper[4898]: I1211 14:37:24.617503 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-h99mt/crc-debug-w28vn"] Dec 11 14:37:24 crc kubenswrapper[4898]: I1211 14:37:24.896935 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h99mt/crc-debug-w28vn" Dec 11 14:37:24 crc kubenswrapper[4898]: I1211 14:37:24.966205 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvgf4\" (UniqueName: \"kubernetes.io/projected/17d10e14-67d6-483c-8c7d-779864dfc0c8-kube-api-access-pvgf4\") pod \"17d10e14-67d6-483c-8c7d-779864dfc0c8\" (UID: \"17d10e14-67d6-483c-8c7d-779864dfc0c8\") " Dec 11 14:37:24 crc kubenswrapper[4898]: I1211 14:37:24.966348 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17d10e14-67d6-483c-8c7d-779864dfc0c8-host\") pod \"17d10e14-67d6-483c-8c7d-779864dfc0c8\" (UID: \"17d10e14-67d6-483c-8c7d-779864dfc0c8\") " Dec 11 14:37:24 crc kubenswrapper[4898]: I1211 14:37:24.966492 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17d10e14-67d6-483c-8c7d-779864dfc0c8-host" (OuterVolumeSpecName: "host") pod "17d10e14-67d6-483c-8c7d-779864dfc0c8" (UID: "17d10e14-67d6-483c-8c7d-779864dfc0c8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 14:37:24 crc kubenswrapper[4898]: I1211 14:37:24.967103 4898 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17d10e14-67d6-483c-8c7d-779864dfc0c8-host\") on node \"crc\" DevicePath \"\"" Dec 11 14:37:24 crc kubenswrapper[4898]: I1211 14:37:24.990349 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17d10e14-67d6-483c-8c7d-779864dfc0c8-kube-api-access-pvgf4" (OuterVolumeSpecName: "kube-api-access-pvgf4") pod "17d10e14-67d6-483c-8c7d-779864dfc0c8" (UID: "17d10e14-67d6-483c-8c7d-779864dfc0c8"). InnerVolumeSpecName "kube-api-access-pvgf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:37:25 crc kubenswrapper[4898]: I1211 14:37:25.068705 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvgf4\" (UniqueName: \"kubernetes.io/projected/17d10e14-67d6-483c-8c7d-779864dfc0c8-kube-api-access-pvgf4\") on node \"crc\" DevicePath \"\"" Dec 11 14:37:25 crc kubenswrapper[4898]: I1211 14:37:25.583197 4898 scope.go:117] "RemoveContainer" containerID="f1997493364ba0d976d505747a6f7fcfeded7d592c0df5e2ed01515febf73f92" Dec 11 14:37:25 crc kubenswrapper[4898]: I1211 14:37:25.583221 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h99mt/crc-debug-w28vn" Dec 11 14:37:26 crc kubenswrapper[4898]: I1211 14:37:26.010875 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h99mt/crc-debug-tgtbq"] Dec 11 14:37:26 crc kubenswrapper[4898]: E1211 14:37:26.011578 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17d10e14-67d6-483c-8c7d-779864dfc0c8" containerName="container-00" Dec 11 14:37:26 crc kubenswrapper[4898]: I1211 14:37:26.011601 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="17d10e14-67d6-483c-8c7d-779864dfc0c8" containerName="container-00" Dec 11 14:37:26 crc kubenswrapper[4898]: I1211 14:37:26.012045 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="17d10e14-67d6-483c-8c7d-779864dfc0c8" containerName="container-00" Dec 11 14:37:26 crc kubenswrapper[4898]: I1211 14:37:26.013411 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h99mt/crc-debug-tgtbq" Dec 11 14:37:26 crc kubenswrapper[4898]: I1211 14:37:26.102762 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f7a66500-4a83-488c-9ce7-846bbeb95fac-host\") pod \"crc-debug-tgtbq\" (UID: \"f7a66500-4a83-488c-9ce7-846bbeb95fac\") " pod="openshift-must-gather-h99mt/crc-debug-tgtbq" Dec 11 14:37:26 crc kubenswrapper[4898]: I1211 14:37:26.103023 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6swnw\" (UniqueName: \"kubernetes.io/projected/f7a66500-4a83-488c-9ce7-846bbeb95fac-kube-api-access-6swnw\") pod \"crc-debug-tgtbq\" (UID: \"f7a66500-4a83-488c-9ce7-846bbeb95fac\") " pod="openshift-must-gather-h99mt/crc-debug-tgtbq" Dec 11 14:37:26 crc kubenswrapper[4898]: I1211 14:37:26.205280 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f7a66500-4a83-488c-9ce7-846bbeb95fac-host\") pod \"crc-debug-tgtbq\" (UID: \"f7a66500-4a83-488c-9ce7-846bbeb95fac\") " pod="openshift-must-gather-h99mt/crc-debug-tgtbq" Dec 11 14:37:26 crc kubenswrapper[4898]: I1211 14:37:26.205365 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6swnw\" (UniqueName: \"kubernetes.io/projected/f7a66500-4a83-488c-9ce7-846bbeb95fac-kube-api-access-6swnw\") pod \"crc-debug-tgtbq\" (UID: \"f7a66500-4a83-488c-9ce7-846bbeb95fac\") " pod="openshift-must-gather-h99mt/crc-debug-tgtbq" Dec 11 14:37:26 crc kubenswrapper[4898]: I1211 14:37:26.205519 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f7a66500-4a83-488c-9ce7-846bbeb95fac-host\") pod \"crc-debug-tgtbq\" (UID: \"f7a66500-4a83-488c-9ce7-846bbeb95fac\") " pod="openshift-must-gather-h99mt/crc-debug-tgtbq" Dec 11 14:37:26 crc 
kubenswrapper[4898]: I1211 14:37:26.232628 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6swnw\" (UniqueName: \"kubernetes.io/projected/f7a66500-4a83-488c-9ce7-846bbeb95fac-kube-api-access-6swnw\") pod \"crc-debug-tgtbq\" (UID: \"f7a66500-4a83-488c-9ce7-846bbeb95fac\") " pod="openshift-must-gather-h99mt/crc-debug-tgtbq" Dec 11 14:37:26 crc kubenswrapper[4898]: I1211 14:37:26.341569 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h99mt/crc-debug-tgtbq" Dec 11 14:37:26 crc kubenswrapper[4898]: I1211 14:37:26.600537 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h99mt/crc-debug-tgtbq" event={"ID":"f7a66500-4a83-488c-9ce7-846bbeb95fac","Type":"ContainerStarted","Data":"d62e1e5576096e4cbc3c62f36445eb847d59fa9058f389babef9f752852113cb"} Dec 11 14:37:26 crc kubenswrapper[4898]: I1211 14:37:26.798667 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17d10e14-67d6-483c-8c7d-779864dfc0c8" path="/var/lib/kubelet/pods/17d10e14-67d6-483c-8c7d-779864dfc0c8/volumes" Dec 11 14:37:27 crc kubenswrapper[4898]: I1211 14:37:27.618540 4898 generic.go:334] "Generic (PLEG): container finished" podID="f7a66500-4a83-488c-9ce7-846bbeb95fac" containerID="610fdd6f46852621ad8566e33bd97ec48e4d9920e3c5551a554d82ff582e5398" exitCode=0 Dec 11 14:37:27 crc kubenswrapper[4898]: I1211 14:37:27.618590 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h99mt/crc-debug-tgtbq" event={"ID":"f7a66500-4a83-488c-9ce7-846bbeb95fac","Type":"ContainerDied","Data":"610fdd6f46852621ad8566e33bd97ec48e4d9920e3c5551a554d82ff582e5398"} Dec 11 14:37:27 crc kubenswrapper[4898]: I1211 14:37:27.679705 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-h99mt/crc-debug-tgtbq"] Dec 11 14:37:27 crc kubenswrapper[4898]: I1211 14:37:27.692212 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-h99mt/crc-debug-tgtbq"] Dec 11 14:37:28 crc kubenswrapper[4898]: I1211 14:37:28.765535 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h99mt/crc-debug-tgtbq" Dec 11 14:37:28 crc kubenswrapper[4898]: I1211 14:37:28.878446 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6swnw\" (UniqueName: \"kubernetes.io/projected/f7a66500-4a83-488c-9ce7-846bbeb95fac-kube-api-access-6swnw\") pod \"f7a66500-4a83-488c-9ce7-846bbeb95fac\" (UID: \"f7a66500-4a83-488c-9ce7-846bbeb95fac\") " Dec 11 14:37:28 crc kubenswrapper[4898]: I1211 14:37:28.878949 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f7a66500-4a83-488c-9ce7-846bbeb95fac-host\") pod \"f7a66500-4a83-488c-9ce7-846bbeb95fac\" (UID: \"f7a66500-4a83-488c-9ce7-846bbeb95fac\") " Dec 11 14:37:28 crc kubenswrapper[4898]: I1211 14:37:28.879014 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7a66500-4a83-488c-9ce7-846bbeb95fac-host" (OuterVolumeSpecName: "host") pod "f7a66500-4a83-488c-9ce7-846bbeb95fac" (UID: "f7a66500-4a83-488c-9ce7-846bbeb95fac"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 14:37:28 crc kubenswrapper[4898]: I1211 14:37:28.879839 4898 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f7a66500-4a83-488c-9ce7-846bbeb95fac-host\") on node \"crc\" DevicePath \"\"" Dec 11 14:37:28 crc kubenswrapper[4898]: I1211 14:37:28.887014 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7a66500-4a83-488c-9ce7-846bbeb95fac-kube-api-access-6swnw" (OuterVolumeSpecName: "kube-api-access-6swnw") pod "f7a66500-4a83-488c-9ce7-846bbeb95fac" (UID: "f7a66500-4a83-488c-9ce7-846bbeb95fac"). 
InnerVolumeSpecName "kube-api-access-6swnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:37:28 crc kubenswrapper[4898]: I1211 14:37:28.983243 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6swnw\" (UniqueName: \"kubernetes.io/projected/f7a66500-4a83-488c-9ce7-846bbeb95fac-kube-api-access-6swnw\") on node \"crc\" DevicePath \"\"" Dec 11 14:37:29 crc kubenswrapper[4898]: I1211 14:37:29.648502 4898 scope.go:117] "RemoveContainer" containerID="610fdd6f46852621ad8566e33bd97ec48e4d9920e3c5551a554d82ff582e5398" Dec 11 14:37:29 crc kubenswrapper[4898]: I1211 14:37:29.648698 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h99mt/crc-debug-tgtbq" Dec 11 14:37:30 crc kubenswrapper[4898]: I1211 14:37:30.786984 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7a66500-4a83-488c-9ce7-846bbeb95fac" path="/var/lib/kubelet/pods/f7a66500-4a83-488c-9ce7-846bbeb95fac/volumes" Dec 11 14:37:46 crc kubenswrapper[4898]: I1211 14:37:46.415951 4898 scope.go:117] "RemoveContainer" containerID="3a93c709439bf4f84ae9d0ed35c1caf2f97fd9fce679a2303cc2124e86f7cee6" Dec 11 14:37:46 crc kubenswrapper[4898]: I1211 14:37:46.469011 4898 scope.go:117] "RemoveContainer" containerID="36d247cc4faa0a84216fa2049b291d71363f1dac5e575f5a9eb3909925add077" Dec 11 14:37:46 crc kubenswrapper[4898]: I1211 14:37:46.499330 4898 scope.go:117] "RemoveContainer" containerID="7620f97c127d87db60fd103de8f2837fafef40dd48d2faec6abbba04426cf1ac" Dec 11 14:37:54 crc kubenswrapper[4898]: I1211 14:37:54.819133 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_bd49b53a-7cbb-4ccf-953a-7ed292c090bf/aodh-api/0.log" Dec 11 14:37:55 crc kubenswrapper[4898]: I1211 14:37:55.033933 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_bd49b53a-7cbb-4ccf-953a-7ed292c090bf/aodh-evaluator/0.log" Dec 11 14:37:55 crc kubenswrapper[4898]: I1211 
14:37:55.075605 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_bd49b53a-7cbb-4ccf-953a-7ed292c090bf/aodh-listener/0.log" Dec 11 14:37:55 crc kubenswrapper[4898]: I1211 14:37:55.075777 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_bd49b53a-7cbb-4ccf-953a-7ed292c090bf/aodh-notifier/0.log" Dec 11 14:37:55 crc kubenswrapper[4898]: I1211 14:37:55.213048 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d9697675d-q65pl_08d32354-0f02-436c-a082-9d02e2ebaadc/barbican-api/0.log" Dec 11 14:37:55 crc kubenswrapper[4898]: I1211 14:37:55.267027 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d9697675d-q65pl_08d32354-0f02-436c-a082-9d02e2ebaadc/barbican-api-log/0.log" Dec 11 14:37:55 crc kubenswrapper[4898]: I1211 14:37:55.382506 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58648c8f48-xlp5l_33744ce1-11e6-4c20-b805-f9ba35221d29/barbican-keystone-listener/0.log" Dec 11 14:37:55 crc kubenswrapper[4898]: I1211 14:37:55.530258 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58648c8f48-xlp5l_33744ce1-11e6-4c20-b805-f9ba35221d29/barbican-keystone-listener-log/0.log" Dec 11 14:37:55 crc kubenswrapper[4898]: I1211 14:37:55.592275 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-668c999b4c-4pjgd_12ec53d2-3707-4276-89f3-58df46bb17bd/barbican-worker/0.log" Dec 11 14:37:55 crc kubenswrapper[4898]: I1211 14:37:55.596409 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-668c999b4c-4pjgd_12ec53d2-3707-4276-89f3-58df46bb17bd/barbican-worker-log/0.log" Dec 11 14:37:55 crc kubenswrapper[4898]: I1211 14:37:55.740592 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-mh8pf_329b098d-eb3e-413e-b398-697e70fb3ef9/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:37:55 crc kubenswrapper[4898]: I1211 14:37:55.812817 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d71ee22f-68e7-43d7-8a6a-012ff8b8104e/ceilometer-central-agent/1.log" Dec 11 14:37:55 crc kubenswrapper[4898]: I1211 14:37:55.951658 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d71ee22f-68e7-43d7-8a6a-012ff8b8104e/ceilometer-notification-agent/0.log" Dec 11 14:37:56 crc kubenswrapper[4898]: I1211 14:37:56.031187 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d71ee22f-68e7-43d7-8a6a-012ff8b8104e/proxy-httpd/0.log" Dec 11 14:37:56 crc kubenswrapper[4898]: I1211 14:37:56.046101 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d71ee22f-68e7-43d7-8a6a-012ff8b8104e/ceilometer-central-agent/0.log" Dec 11 14:37:56 crc kubenswrapper[4898]: I1211 14:37:56.070755 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d71ee22f-68e7-43d7-8a6a-012ff8b8104e/sg-core/0.log" Dec 11 14:37:56 crc kubenswrapper[4898]: I1211 14:37:56.252117 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2/cinder-api-log/0.log" Dec 11 14:37:56 crc kubenswrapper[4898]: I1211 14:37:56.338059 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3d7a3f50-05f9-4ffd-b473-b7c0c1563bc2/cinder-api/0.log" Dec 11 14:37:56 crc kubenswrapper[4898]: I1211 14:37:56.464219 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_512ddc04-04b7-409c-a856-4f9bf3b22c50/cinder-scheduler/1.log" Dec 11 14:37:56 crc kubenswrapper[4898]: I1211 14:37:56.504253 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_512ddc04-04b7-409c-a856-4f9bf3b22c50/cinder-scheduler/0.log" Dec 11 14:37:56 crc kubenswrapper[4898]: I1211 14:37:56.589540 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_512ddc04-04b7-409c-a856-4f9bf3b22c50/probe/0.log" Dec 11 14:37:56 crc kubenswrapper[4898]: I1211 14:37:56.770655 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-df9km_112b077a-0512-4528-8b26-158d512f09ad/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:37:56 crc kubenswrapper[4898]: I1211 14:37:56.851200 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-k57p2_5c240120-432c-4eca-a36a-b16b03d2fbd2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:37:57 crc kubenswrapper[4898]: I1211 14:37:57.021741 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-9gqbj_1324fad0-9e86-4e14-9d9f-da9de3cf3be7/init/0.log" Dec 11 14:37:57 crc kubenswrapper[4898]: I1211 14:37:57.223606 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-9gqbj_1324fad0-9e86-4e14-9d9f-da9de3cf3be7/init/0.log" Dec 11 14:37:57 crc kubenswrapper[4898]: I1211 14:37:57.263669 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-dj55k_fcd00abe-3b13-42b3-81bc-671095c37415/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:37:57 crc kubenswrapper[4898]: I1211 14:37:57.268073 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-9gqbj_1324fad0-9e86-4e14-9d9f-da9de3cf3be7/dnsmasq-dns/0.log" Dec 11 14:37:57 crc kubenswrapper[4898]: I1211 14:37:57.464395 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_72c623d3-596d-4db8-8447-3bc93f7187e4/glance-httpd/0.log" Dec 11 14:37:57 crc kubenswrapper[4898]: I1211 14:37:57.508606 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_72c623d3-596d-4db8-8447-3bc93f7187e4/glance-log/0.log" Dec 11 14:37:58 crc kubenswrapper[4898]: I1211 14:37:58.440670 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5a98434b-9a01-4f9f-a0c0-5c52ab613405/glance-httpd/0.log" Dec 11 14:37:58 crc kubenswrapper[4898]: I1211 14:37:58.463002 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5a98434b-9a01-4f9f-a0c0-5c52ab613405/glance-log/0.log" Dec 11 14:37:58 crc kubenswrapper[4898]: I1211 14:37:58.913221 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-686c7f94b-jlnr7_836f22d0-0883-463e-942b-abb6931a997f/heat-engine/0.log" Dec 11 14:37:59 crc kubenswrapper[4898]: I1211 14:37:59.042296 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gvs5l_799a6ddd-3abc-4961-92b4-4a9db3b41f64/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:37:59 crc kubenswrapper[4898]: I1211 14:37:59.166659 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-58ffc484cf-pk2vt_d83af347-3774-4e08-8138-6e67557da826/heat-api/0.log" Dec 11 14:37:59 crc kubenswrapper[4898]: I1211 14:37:59.291027 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-68bbb97c49-zk2lz_911f5e17-1a51-4bf3-8f1c-cdedc2f4404c/heat-cfnapi/0.log" Dec 11 14:37:59 crc kubenswrapper[4898]: I1211 14:37:59.344085 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-lv4d4_0c0edae9-777b-4b14-9014-8e7dddcc6319/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:37:59 crc kubenswrapper[4898]: I1211 14:37:59.459938 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29424361-bt5ps_587f3b75-4378-4c14-a2e0-e990a8270221/keystone-cron/0.log" Dec 11 14:37:59 crc kubenswrapper[4898]: I1211 14:37:59.626204 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_882df8e7-3b5e-4dd9-af28-fa75e752cade/kube-state-metrics/0.log" Dec 11 14:38:00 crc kubenswrapper[4898]: I1211 14:38:00.004848 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-mzktd_500dff36-d95c-4690-8fad-db278c0c0ac9/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:38:00 crc kubenswrapper[4898]: I1211 14:38:00.356534 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5d97d757c9-6txl4_c23801ba-7898-47cc-bbf9-bd108c25f99e/keystone-api/0.log" Dec 11 14:38:00 crc kubenswrapper[4898]: I1211 14:38:00.410997 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-qbckn_2b009a23-c578-4a3c-aca4-b68d1b9e9118/logging-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:38:00 crc kubenswrapper[4898]: I1211 14:38:00.613924 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_8a0fdc4f-1bd6-4444-80e4-6ce57885c417/mysqld-exporter/0.log" Dec 11 14:38:00 crc kubenswrapper[4898]: I1211 14:38:00.887095 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-84b4b98fdc-tbjdg_72ca8f16-912b-44f0-bc9d-868f381fb8fb/neutron-api/0.log" Dec 11 14:38:00 crc kubenswrapper[4898]: I1211 14:38:00.953708 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-84b4b98fdc-tbjdg_72ca8f16-912b-44f0-bc9d-868f381fb8fb/neutron-httpd/0.log" Dec 11 14:38:01 crc kubenswrapper[4898]: I1211 14:38:01.012294 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wt42_67ae3c0c-6ce1-429e-953d-2cac885ee43c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:38:01 crc kubenswrapper[4898]: I1211 14:38:01.584209 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_6dbbbea1-e6ed-4ad9-9e31-baf09c312e76/nova-cell0-conductor-conductor/0.log" Dec 11 14:38:01 crc kubenswrapper[4898]: I1211 14:38:01.678887 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_12f955af-c5c3-47de-b1ef-f23b46d06a62/nova-api-log/0.log" Dec 11 14:38:01 crc kubenswrapper[4898]: I1211 14:38:01.873958 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_ec3c0ff6-8367-4309-a686-820483a8f6e5/nova-cell1-conductor-conductor/0.log" Dec 11 14:38:02 crc kubenswrapper[4898]: I1211 14:38:02.058419 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b2660b5e-f832-4d7b-8fa5-f3f661708a33/nova-cell1-novncproxy-novncproxy/0.log" Dec 11 14:38:02 crc kubenswrapper[4898]: I1211 14:38:02.157265 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_12f955af-c5c3-47de-b1ef-f23b46d06a62/nova-api-api/0.log" Dec 11 14:38:02 crc kubenswrapper[4898]: I1211 14:38:02.182634 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jrmnk_57ebbb8f-174c-4862-bba6-5644c98c7b1c/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:38:02 crc kubenswrapper[4898]: I1211 14:38:02.522201 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_a7102f4f-6a78-4e44-8f68-9efeccd5d632/nova-metadata-log/0.log" Dec 11 14:38:02 crc kubenswrapper[4898]: I1211 14:38:02.736673 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_5ae31191-f9f6-452a-8f45-a48b4736012e/mysql-bootstrap/0.log" Dec 11 14:38:02 crc kubenswrapper[4898]: I1211 14:38:02.760629 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_119fd4c2-86a1-48d1-8005-2fc9a3062219/nova-scheduler-scheduler/0.log" Dec 11 14:38:02 crc kubenswrapper[4898]: I1211 14:38:02.977301 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_5ae31191-f9f6-452a-8f45-a48b4736012e/mysql-bootstrap/0.log" Dec 11 14:38:03 crc kubenswrapper[4898]: I1211 14:38:03.005624 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_5ae31191-f9f6-452a-8f45-a48b4736012e/galera/0.log" Dec 11 14:38:03 crc kubenswrapper[4898]: I1211 14:38:03.068098 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_5ae31191-f9f6-452a-8f45-a48b4736012e/galera/1.log" Dec 11 14:38:03 crc kubenswrapper[4898]: I1211 14:38:03.207935 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2e7cffb6-80f8-45e8-a4ab-219dc834a613/mysql-bootstrap/0.log" Dec 11 14:38:03 crc kubenswrapper[4898]: I1211 14:38:03.489993 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2e7cffb6-80f8-45e8-a4ab-219dc834a613/mysql-bootstrap/0.log" Dec 11 14:38:03 crc kubenswrapper[4898]: I1211 14:38:03.533510 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2e7cffb6-80f8-45e8-a4ab-219dc834a613/galera/0.log" Dec 11 14:38:03 crc kubenswrapper[4898]: I1211 14:38:03.555634 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_2e7cffb6-80f8-45e8-a4ab-219dc834a613/galera/1.log" Dec 11 14:38:03 crc kubenswrapper[4898]: I1211 14:38:03.756856 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_11154469-5a32-47bb-bbaf-66ea95afcf82/openstackclient/0.log" Dec 11 14:38:03 crc kubenswrapper[4898]: I1211 14:38:03.910089 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-lsxj7_0c503750-f2d3-42e3-84ba-1db55db9228f/ovn-controller/0.log" Dec 11 14:38:04 crc kubenswrapper[4898]: I1211 14:38:04.075414 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5wp9b_33e30fc7-973e-436e-a9df-c839f8609a99/openstack-network-exporter/0.log" Dec 11 14:38:04 crc kubenswrapper[4898]: I1211 14:38:04.203257 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fxk76_a6777806-e5a2-4585-bd5a-8ba7f7757c59/ovsdb-server-init/0.log" Dec 11 14:38:04 crc kubenswrapper[4898]: I1211 14:38:04.443231 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fxk76_a6777806-e5a2-4585-bd5a-8ba7f7757c59/ovs-vswitchd/0.log" Dec 11 14:38:04 crc kubenswrapper[4898]: I1211 14:38:04.465027 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fxk76_a6777806-e5a2-4585-bd5a-8ba7f7757c59/ovsdb-server-init/0.log" Dec 11 14:38:04 crc kubenswrapper[4898]: I1211 14:38:04.471893 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fxk76_a6777806-e5a2-4585-bd5a-8ba7f7757c59/ovsdb-server/0.log" Dec 11 14:38:04 crc kubenswrapper[4898]: I1211 14:38:04.725030 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-nhkjq_68ef432f-8154-4f0d-b92f-5cfeed0c22ce/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:38:04 crc kubenswrapper[4898]: I1211 14:38:04.877231 4898 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_69fcbdad-bb34-4a36-9100-352ddce7c906/openstack-network-exporter/0.log" Dec 11 14:38:04 crc kubenswrapper[4898]: I1211 14:38:04.906018 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a7102f4f-6a78-4e44-8f68-9efeccd5d632/nova-metadata-metadata/0.log" Dec 11 14:38:04 crc kubenswrapper[4898]: I1211 14:38:04.954119 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_69fcbdad-bb34-4a36-9100-352ddce7c906/ovn-northd/0.log" Dec 11 14:38:05 crc kubenswrapper[4898]: I1211 14:38:05.108955 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2f82c55a-6891-4ba6-bcbb-854c918faa92/openstack-network-exporter/0.log" Dec 11 14:38:05 crc kubenswrapper[4898]: I1211 14:38:05.149254 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2f82c55a-6891-4ba6-bcbb-854c918faa92/ovsdbserver-nb/0.log" Dec 11 14:38:05 crc kubenswrapper[4898]: I1211 14:38:05.312714 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_aab51a6c-5473-415e-913c-dcb5907cd012/openstack-network-exporter/0.log" Dec 11 14:38:05 crc kubenswrapper[4898]: I1211 14:38:05.321704 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_aab51a6c-5473-415e-913c-dcb5907cd012/ovsdbserver-sb/0.log" Dec 11 14:38:05 crc kubenswrapper[4898]: I1211 14:38:05.578238 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6849f86cdd-lt69n_0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1/placement-api/0.log" Dec 11 14:38:05 crc kubenswrapper[4898]: I1211 14:38:05.654440 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_81600875-da95-4cb0-b179-1804494d29d8/init-config-reloader/0.log" Dec 11 14:38:05 crc kubenswrapper[4898]: I1211 14:38:05.656612 4898 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_placement-6849f86cdd-lt69n_0ee89c59-41c3-4df8-b1ed-7a0fab3b6ee1/placement-log/0.log" Dec 11 14:38:05 crc kubenswrapper[4898]: I1211 14:38:05.937446 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_81600875-da95-4cb0-b179-1804494d29d8/init-config-reloader/0.log" Dec 11 14:38:05 crc kubenswrapper[4898]: I1211 14:38:05.941239 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_81600875-da95-4cb0-b179-1804494d29d8/prometheus/0.log" Dec 11 14:38:05 crc kubenswrapper[4898]: I1211 14:38:05.942141 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_81600875-da95-4cb0-b179-1804494d29d8/thanos-sidecar/0.log" Dec 11 14:38:05 crc kubenswrapper[4898]: I1211 14:38:05.984862 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_81600875-da95-4cb0-b179-1804494d29d8/config-reloader/0.log" Dec 11 14:38:06 crc kubenswrapper[4898]: I1211 14:38:06.146009 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_91c646bc-40ca-434e-8db2-df2eb46c4e5e/setup-container/0.log" Dec 11 14:38:06 crc kubenswrapper[4898]: I1211 14:38:06.388233 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_91c646bc-40ca-434e-8db2-df2eb46c4e5e/setup-container/0.log" Dec 11 14:38:06 crc kubenswrapper[4898]: I1211 14:38:06.409633 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d7d19abc-90d0-413d-b8d7-67ae58b010f7/setup-container/0.log" Dec 11 14:38:06 crc kubenswrapper[4898]: I1211 14:38:06.461801 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_91c646bc-40ca-434e-8db2-df2eb46c4e5e/rabbitmq/0.log" Dec 11 14:38:06 crc kubenswrapper[4898]: I1211 14:38:06.745511 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_d7d19abc-90d0-413d-b8d7-67ae58b010f7/setup-container/0.log" Dec 11 14:38:06 crc kubenswrapper[4898]: I1211 14:38:06.758362 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-cgb92_968ddbd1-b46e-4d09-85d8-ebcf30b32cb6/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:38:06 crc kubenswrapper[4898]: I1211 14:38:06.780359 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d7d19abc-90d0-413d-b8d7-67ae58b010f7/rabbitmq/0.log" Dec 11 14:38:06 crc kubenswrapper[4898]: I1211 14:38:06.973052 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-89l85_aec2abe1-e7dc-48b7-b34d-f5b50f289ab8/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:38:07 crc kubenswrapper[4898]: I1211 14:38:07.032613 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-mqdrp_8831e333-4a32-4f65-85e7-16a9a95360c9/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:38:07 crc kubenswrapper[4898]: I1211 14:38:07.222910 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-nz76h_0ffe508f-3789-4430-89e6-fa3faa46514d/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:38:07 crc kubenswrapper[4898]: I1211 14:38:07.304386 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-zzk8h_680cd71e-efd0-4750-80a1-3c719e9192c2/ssh-known-hosts-edpm-deployment/0.log" Dec 11 14:38:08 crc kubenswrapper[4898]: I1211 14:38:08.120337 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6867fd7bcf-bbj7b_4ed17564-edd0-4a66-8b9b-04aabd280113/proxy-server/0.log" Dec 11 14:38:08 crc kubenswrapper[4898]: I1211 14:38:08.259166 4898 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-ring-rebalance-lx2j5_6d5760a1-aea8-4f95-8da7-8832f8879d57/swift-ring-rebalance/0.log" Dec 11 14:38:08 crc kubenswrapper[4898]: I1211 14:38:08.339044 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/account-auditor/0.log" Dec 11 14:38:08 crc kubenswrapper[4898]: I1211 14:38:08.367817 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6867fd7bcf-bbj7b_4ed17564-edd0-4a66-8b9b-04aabd280113/proxy-httpd/0.log" Dec 11 14:38:08 crc kubenswrapper[4898]: I1211 14:38:08.471084 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/account-reaper/0.log" Dec 11 14:38:08 crc kubenswrapper[4898]: I1211 14:38:08.588280 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/account-replicator/0.log" Dec 11 14:38:08 crc kubenswrapper[4898]: I1211 14:38:08.645070 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/container-auditor/0.log" Dec 11 14:38:08 crc kubenswrapper[4898]: I1211 14:38:08.670556 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/account-server/0.log" Dec 11 14:38:08 crc kubenswrapper[4898]: I1211 14:38:08.702680 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/container-replicator/0.log" Dec 11 14:38:08 crc kubenswrapper[4898]: I1211 14:38:08.824176 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/container-server/0.log" Dec 11 14:38:08 crc kubenswrapper[4898]: I1211 14:38:08.838876 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/container-updater/0.log" Dec 11 14:38:08 crc kubenswrapper[4898]: I1211 14:38:08.951317 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/object-auditor/0.log" Dec 11 14:38:08 crc kubenswrapper[4898]: I1211 14:38:08.956050 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/object-expirer/0.log" Dec 11 14:38:09 crc kubenswrapper[4898]: I1211 14:38:09.076637 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/object-server/0.log" Dec 11 14:38:09 crc kubenswrapper[4898]: I1211 14:38:09.095173 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/object-replicator/0.log" Dec 11 14:38:09 crc kubenswrapper[4898]: I1211 14:38:09.200709 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/object-updater/0.log" Dec 11 14:38:09 crc kubenswrapper[4898]: I1211 14:38:09.242322 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/rsync/0.log" Dec 11 14:38:09 crc kubenswrapper[4898]: I1211 14:38:09.296632 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_717e7951-d95b-497f-b2b7-3ec4ef755642/swift-recon-cron/0.log" Dec 11 14:38:09 crc kubenswrapper[4898]: I1211 14:38:09.497011 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-tg6rn_c4f48310-0f95-46eb-9a4f-d1058c7a47c0/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:38:10 crc kubenswrapper[4898]: I1211 14:38:10.315147 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-998xg_2330f5fe-a653-4a3b-8563-d9bce00bd081/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:38:10 crc kubenswrapper[4898]: I1211 14:38:10.497137 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_a1c00711-2048-486e-b3f7-3d5441032df8/tempest-tests-tempest-tests-runner/0.log" Dec 11 14:38:10 crc kubenswrapper[4898]: I1211 14:38:10.566992 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_5a0f55fc-21c0-4455-9732-de26acfd5907/test-operator-logs-container/0.log" Dec 11 14:38:10 crc kubenswrapper[4898]: I1211 14:38:10.682771 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-dwntb_9f59c773-f89d-41f6-97a4-26e5a362d79f/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 14:38:18 crc kubenswrapper[4898]: I1211 14:38:18.105821 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lt87r"] Dec 11 14:38:18 crc kubenswrapper[4898]: E1211 14:38:18.107344 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a66500-4a83-488c-9ce7-846bbeb95fac" containerName="container-00" Dec 11 14:38:18 crc kubenswrapper[4898]: I1211 14:38:18.107365 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a66500-4a83-488c-9ce7-846bbeb95fac" containerName="container-00" Dec 11 14:38:18 crc kubenswrapper[4898]: I1211 14:38:18.107702 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7a66500-4a83-488c-9ce7-846bbeb95fac" containerName="container-00" Dec 11 14:38:18 crc kubenswrapper[4898]: I1211 14:38:18.109774 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:18 crc kubenswrapper[4898]: I1211 14:38:18.127396 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lt87r"] Dec 11 14:38:18 crc kubenswrapper[4898]: I1211 14:38:18.263743 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20c294a4-6655-4e6e-a056-7685588b8fd8-catalog-content\") pod \"redhat-operators-lt87r\" (UID: \"20c294a4-6655-4e6e-a056-7685588b8fd8\") " pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:18 crc kubenswrapper[4898]: I1211 14:38:18.263844 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h57vr\" (UniqueName: \"kubernetes.io/projected/20c294a4-6655-4e6e-a056-7685588b8fd8-kube-api-access-h57vr\") pod \"redhat-operators-lt87r\" (UID: \"20c294a4-6655-4e6e-a056-7685588b8fd8\") " pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:18 crc kubenswrapper[4898]: I1211 14:38:18.263981 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20c294a4-6655-4e6e-a056-7685588b8fd8-utilities\") pod \"redhat-operators-lt87r\" (UID: \"20c294a4-6655-4e6e-a056-7685588b8fd8\") " pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:18 crc kubenswrapper[4898]: I1211 14:38:18.366389 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20c294a4-6655-4e6e-a056-7685588b8fd8-utilities\") pod \"redhat-operators-lt87r\" (UID: \"20c294a4-6655-4e6e-a056-7685588b8fd8\") " pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:18 crc kubenswrapper[4898]: I1211 14:38:18.366494 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20c294a4-6655-4e6e-a056-7685588b8fd8-catalog-content\") pod \"redhat-operators-lt87r\" (UID: \"20c294a4-6655-4e6e-a056-7685588b8fd8\") " pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:18 crc kubenswrapper[4898]: I1211 14:38:18.366561 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h57vr\" (UniqueName: \"kubernetes.io/projected/20c294a4-6655-4e6e-a056-7685588b8fd8-kube-api-access-h57vr\") pod \"redhat-operators-lt87r\" (UID: \"20c294a4-6655-4e6e-a056-7685588b8fd8\") " pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:18 crc kubenswrapper[4898]: I1211 14:38:18.367330 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20c294a4-6655-4e6e-a056-7685588b8fd8-utilities\") pod \"redhat-operators-lt87r\" (UID: \"20c294a4-6655-4e6e-a056-7685588b8fd8\") " pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:18 crc kubenswrapper[4898]: I1211 14:38:18.367559 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20c294a4-6655-4e6e-a056-7685588b8fd8-catalog-content\") pod \"redhat-operators-lt87r\" (UID: \"20c294a4-6655-4e6e-a056-7685588b8fd8\") " pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:18 crc kubenswrapper[4898]: I1211 14:38:18.401028 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h57vr\" (UniqueName: \"kubernetes.io/projected/20c294a4-6655-4e6e-a056-7685588b8fd8-kube-api-access-h57vr\") pod \"redhat-operators-lt87r\" (UID: \"20c294a4-6655-4e6e-a056-7685588b8fd8\") " pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:18 crc kubenswrapper[4898]: I1211 14:38:18.462487 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:19 crc kubenswrapper[4898]: I1211 14:38:19.257490 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lt87r"] Dec 11 14:38:19 crc kubenswrapper[4898]: I1211 14:38:19.268670 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lt87r" event={"ID":"20c294a4-6655-4e6e-a056-7685588b8fd8","Type":"ContainerStarted","Data":"80aa41e5babeae7cf850b4584996deefa9b2124b63df033f07a0995806f7a965"} Dec 11 14:38:20 crc kubenswrapper[4898]: I1211 14:38:20.282256 4898 generic.go:334] "Generic (PLEG): container finished" podID="20c294a4-6655-4e6e-a056-7685588b8fd8" containerID="0722d22d7f71bb4d6bca583369ea6783dcc4289469decbc65aa4791e5e7beee1" exitCode=0 Dec 11 14:38:20 crc kubenswrapper[4898]: I1211 14:38:20.282474 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lt87r" event={"ID":"20c294a4-6655-4e6e-a056-7685588b8fd8","Type":"ContainerDied","Data":"0722d22d7f71bb4d6bca583369ea6783dcc4289469decbc65aa4791e5e7beee1"} Dec 11 14:38:21 crc kubenswrapper[4898]: I1211 14:38:21.300145 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lt87r" event={"ID":"20c294a4-6655-4e6e-a056-7685588b8fd8","Type":"ContainerStarted","Data":"4845f5c1aabb9b1d69742cf6c662345a5d96c4dd5381bd7be0e81c1f6ec607ce"} Dec 11 14:38:24 crc kubenswrapper[4898]: I1211 14:38:24.229621 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f596cf47-0571-443a-9104-c61109f65d44/memcached/0.log" Dec 11 14:38:25 crc kubenswrapper[4898]: I1211 14:38:25.352288 4898 generic.go:334] "Generic (PLEG): container finished" podID="20c294a4-6655-4e6e-a056-7685588b8fd8" containerID="4845f5c1aabb9b1d69742cf6c662345a5d96c4dd5381bd7be0e81c1f6ec607ce" exitCode=0 Dec 11 14:38:25 crc kubenswrapper[4898]: I1211 14:38:25.352359 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lt87r" event={"ID":"20c294a4-6655-4e6e-a056-7685588b8fd8","Type":"ContainerDied","Data":"4845f5c1aabb9b1d69742cf6c662345a5d96c4dd5381bd7be0e81c1f6ec607ce"} Dec 11 14:38:26 crc kubenswrapper[4898]: I1211 14:38:26.395160 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lt87r" event={"ID":"20c294a4-6655-4e6e-a056-7685588b8fd8","Type":"ContainerStarted","Data":"d843bae8a5e0c3171656c340dc7e08c8a8004cc2d4441211771f6308699e14be"} Dec 11 14:38:26 crc kubenswrapper[4898]: I1211 14:38:26.423624 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lt87r" podStartSLOduration=2.892793536 podStartE2EDuration="8.423604816s" podCreationTimestamp="2025-12-11 14:38:18 +0000 UTC" firstStartedPulling="2025-12-11 14:38:20.284245257 +0000 UTC m=+5657.856571694" lastFinishedPulling="2025-12-11 14:38:25.815056537 +0000 UTC m=+5663.387382974" observedRunningTime="2025-12-11 14:38:26.419224109 +0000 UTC m=+5663.991550576" watchObservedRunningTime="2025-12-11 14:38:26.423604816 +0000 UTC m=+5663.995931253" Dec 11 14:38:28 crc kubenswrapper[4898]: I1211 14:38:28.463970 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:28 crc kubenswrapper[4898]: I1211 14:38:28.464485 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:29 crc kubenswrapper[4898]: I1211 14:38:29.516093 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lt87r" podUID="20c294a4-6655-4e6e-a056-7685588b8fd8" containerName="registry-server" probeResult="failure" output=< Dec 11 14:38:29 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:38:29 crc kubenswrapper[4898]: > Dec 11 
14:38:39 crc kubenswrapper[4898]: I1211 14:38:39.522494 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lt87r" podUID="20c294a4-6655-4e6e-a056-7685588b8fd8" containerName="registry-server" probeResult="failure" output=< Dec 11 14:38:39 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:38:39 crc kubenswrapper[4898]: > Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.149796 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz_4c60c53a-7f88-4da3-8ede-b6f1c3e32564/util/0.log" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.419292 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz_4c60c53a-7f88-4da3-8ede-b6f1c3e32564/pull/0.log" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.462127 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz_4c60c53a-7f88-4da3-8ede-b6f1c3e32564/util/0.log" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.465819 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz_4c60c53a-7f88-4da3-8ede-b6f1c3e32564/pull/0.log" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.508983 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9665j"] Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.512427 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9665j" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.523494 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9665j"] Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.694684 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz_4c60c53a-7f88-4da3-8ede-b6f1c3e32564/util/0.log" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.701434 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glmjj\" (UniqueName: \"kubernetes.io/projected/71ef5c35-e68f-40ef-aa81-27670ff4dd27-kube-api-access-glmjj\") pod \"community-operators-9665j\" (UID: \"71ef5c35-e68f-40ef-aa81-27670ff4dd27\") " pod="openshift-marketplace/community-operators-9665j" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.701588 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71ef5c35-e68f-40ef-aa81-27670ff4dd27-utilities\") pod \"community-operators-9665j\" (UID: \"71ef5c35-e68f-40ef-aa81-27670ff4dd27\") " pod="openshift-marketplace/community-operators-9665j" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.701661 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71ef5c35-e68f-40ef-aa81-27670ff4dd27-catalog-content\") pod \"community-operators-9665j\" (UID: \"71ef5c35-e68f-40ef-aa81-27670ff4dd27\") " pod="openshift-marketplace/community-operators-9665j" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.705680 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz_4c60c53a-7f88-4da3-8ede-b6f1c3e32564/pull/0.log" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.773239 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_46a92e0302b0514c24e6aa6ad8547900656bd726806e912560255e10bdqbmbz_4c60c53a-7f88-4da3-8ede-b6f1c3e32564/extract/0.log" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.804294 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glmjj\" (UniqueName: \"kubernetes.io/projected/71ef5c35-e68f-40ef-aa81-27670ff4dd27-kube-api-access-glmjj\") pod \"community-operators-9665j\" (UID: \"71ef5c35-e68f-40ef-aa81-27670ff4dd27\") " pod="openshift-marketplace/community-operators-9665j" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.804356 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71ef5c35-e68f-40ef-aa81-27670ff4dd27-utilities\") pod \"community-operators-9665j\" (UID: \"71ef5c35-e68f-40ef-aa81-27670ff4dd27\") " pod="openshift-marketplace/community-operators-9665j" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.804377 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71ef5c35-e68f-40ef-aa81-27670ff4dd27-catalog-content\") pod \"community-operators-9665j\" (UID: \"71ef5c35-e68f-40ef-aa81-27670ff4dd27\") " pod="openshift-marketplace/community-operators-9665j" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.804907 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71ef5c35-e68f-40ef-aa81-27670ff4dd27-catalog-content\") pod \"community-operators-9665j\" (UID: \"71ef5c35-e68f-40ef-aa81-27670ff4dd27\") " pod="openshift-marketplace/community-operators-9665j" Dec 
11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.805039 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71ef5c35-e68f-40ef-aa81-27670ff4dd27-utilities\") pod \"community-operators-9665j\" (UID: \"71ef5c35-e68f-40ef-aa81-27670ff4dd27\") " pod="openshift-marketplace/community-operators-9665j" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.826185 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glmjj\" (UniqueName: \"kubernetes.io/projected/71ef5c35-e68f-40ef-aa81-27670ff4dd27-kube-api-access-glmjj\") pod \"community-operators-9665j\" (UID: \"71ef5c35-e68f-40ef-aa81-27670ff4dd27\") " pod="openshift-marketplace/community-operators-9665j" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.835209 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9665j" Dec 11 14:38:47 crc kubenswrapper[4898]: I1211 14:38:47.899055 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-hmbfx_c99ec3c0-d415-4322-95cd-d57411a1db7b/kube-rbac-proxy/0.log" Dec 11 14:38:48 crc kubenswrapper[4898]: I1211 14:38:48.005273 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-hmbfx_c99ec3c0-d415-4322-95cd-d57411a1db7b/manager/0.log" Dec 11 14:38:48 crc kubenswrapper[4898]: I1211 14:38:48.086175 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-rfxp7_c154e39f-1760-4071-b688-f301c3a398e7/kube-rbac-proxy/0.log" Dec 11 14:38:48 crc kubenswrapper[4898]: I1211 14:38:48.337920 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-rfxp7_c154e39f-1760-4071-b688-f301c3a398e7/manager/0.log" Dec 11 
14:38:48 crc kubenswrapper[4898]: I1211 14:38:48.385828 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9665j"] Dec 11 14:38:48 crc kubenswrapper[4898]: I1211 14:38:48.412868 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-pdvzc_d7d8e047-7525-4d88-b802-550590e7f743/kube-rbac-proxy/0.log" Dec 11 14:38:48 crc kubenswrapper[4898]: I1211 14:38:48.436726 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-pdvzc_d7d8e047-7525-4d88-b802-550590e7f743/manager/0.log" Dec 11 14:38:48 crc kubenswrapper[4898]: I1211 14:38:48.532553 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:48 crc kubenswrapper[4898]: I1211 14:38:48.616101 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:48 crc kubenswrapper[4898]: I1211 14:38:48.643070 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9665j" event={"ID":"71ef5c35-e68f-40ef-aa81-27670ff4dd27","Type":"ContainerStarted","Data":"95959ab1e34ba12628f60c8c7d009f4a084639a7f11a268cc658186aa5b14ddb"} Dec 11 14:38:48 crc kubenswrapper[4898]: I1211 14:38:48.863391 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-7kffw_5c391a19-7c2d-4838-9269-2c5cd8eea1ad/kube-rbac-proxy/0.log" Dec 11 14:38:49 crc kubenswrapper[4898]: I1211 14:38:49.108676 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-7kffw_5c391a19-7c2d-4838-9269-2c5cd8eea1ad/manager/0.log" Dec 11 14:38:49 crc kubenswrapper[4898]: I1211 14:38:49.157428 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-4lcmf_7a0813f7-7167-46ed-b9f8-e2157e92f620/kube-rbac-proxy/0.log" Dec 11 14:38:49 crc kubenswrapper[4898]: I1211 14:38:49.205443 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-4lcmf_7a0813f7-7167-46ed-b9f8-e2157e92f620/manager/0.log" Dec 11 14:38:49 crc kubenswrapper[4898]: I1211 14:38:49.333997 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-tq8mw_0c6054e7-bb0a-4cbd-b459-d9d100182fa1/kube-rbac-proxy/0.log" Dec 11 14:38:49 crc kubenswrapper[4898]: I1211 14:38:49.421256 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-tq8mw_0c6054e7-bb0a-4cbd-b459-d9d100182fa1/manager/0.log" Dec 11 14:38:49 crc kubenswrapper[4898]: I1211 14:38:49.660501 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-8nj46_e2834985-dbd0-4ad6-afd2-8238997ec8e5/kube-rbac-proxy/0.log" Dec 11 14:38:49 crc kubenswrapper[4898]: I1211 14:38:49.664428 4898 generic.go:334] "Generic (PLEG): container finished" podID="71ef5c35-e68f-40ef-aa81-27670ff4dd27" containerID="f3a51724ba2ab033362c08f7f75ed2ca58d08619c6f4813fefcdb22e3ed98581" exitCode=0 Dec 11 14:38:49 crc kubenswrapper[4898]: I1211 14:38:49.664502 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9665j" event={"ID":"71ef5c35-e68f-40ef-aa81-27670ff4dd27","Type":"ContainerDied","Data":"f3a51724ba2ab033362c08f7f75ed2ca58d08619c6f4813fefcdb22e3ed98581"} Dec 11 14:38:49 crc kubenswrapper[4898]: I1211 14:38:49.842568 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-qrfjf_1f2676f6-97b8-425e-9d05-9ec2c52055de/kube-rbac-proxy/0.log" Dec 11 14:38:49 crc kubenswrapper[4898]: I1211 14:38:49.870867 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-8nj46_e2834985-dbd0-4ad6-afd2-8238997ec8e5/manager/0.log" Dec 11 14:38:49 crc kubenswrapper[4898]: I1211 14:38:49.888636 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-qrfjf_1f2676f6-97b8-425e-9d05-9ec2c52055de/manager/0.log" Dec 11 14:38:50 crc kubenswrapper[4898]: I1211 14:38:50.051771 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-wlqm4_e87a760e-bf60-4a98-bb37-1f44745e250f/kube-rbac-proxy/0.log" Dec 11 14:38:50 crc kubenswrapper[4898]: I1211 14:38:50.167351 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-wlqm4_e87a760e-bf60-4a98-bb37-1f44745e250f/manager/0.log" Dec 11 14:38:50 crc kubenswrapper[4898]: I1211 14:38:50.488401 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lt87r"] Dec 11 14:38:50 crc kubenswrapper[4898]: I1211 14:38:50.489087 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lt87r" podUID="20c294a4-6655-4e6e-a056-7685588b8fd8" containerName="registry-server" containerID="cri-o://d843bae8a5e0c3171656c340dc7e08c8a8004cc2d4441211771f6308699e14be" gracePeriod=2 Dec 11 14:38:50 crc kubenswrapper[4898]: I1211 14:38:50.878281 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-wxz25_a99a2194-b89b-4a6a-a086-acd20b489632/kube-rbac-proxy/0.log" Dec 11 14:38:50 crc 
kubenswrapper[4898]: I1211 14:38:50.938356 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-wxz25_a99a2194-b89b-4a6a-a086-acd20b489632/manager/0.log" Dec 11 14:38:50 crc kubenswrapper[4898]: I1211 14:38:50.972084 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-qpznh_1e77353c-6728-4dfa-814c-1a92115c8bf2/kube-rbac-proxy/0.log" Dec 11 14:38:51 crc kubenswrapper[4898]: I1211 14:38:51.222118 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-qpznh_1e77353c-6728-4dfa-814c-1a92115c8bf2/manager/0.log" Dec 11 14:38:51 crc kubenswrapper[4898]: I1211 14:38:51.349653 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-pgfnv_09d9c781-008c-4486-807c-159f4fefe857/kube-rbac-proxy/0.log" Dec 11 14:38:51 crc kubenswrapper[4898]: I1211 14:38:51.353659 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-pgfnv_09d9c781-008c-4486-807c-159f4fefe857/manager/0.log" Dec 11 14:38:51 crc kubenswrapper[4898]: I1211 14:38:51.688377 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9665j" event={"ID":"71ef5c35-e68f-40ef-aa81-27670ff4dd27","Type":"ContainerStarted","Data":"60aef1bd8ae21f8135db27a873743cdf385975f2e0b680918a47c740791f4bab"} Dec 11 14:38:51 crc kubenswrapper[4898]: I1211 14:38:51.690797 4898 generic.go:334] "Generic (PLEG): container finished" podID="20c294a4-6655-4e6e-a056-7685588b8fd8" containerID="d843bae8a5e0c3171656c340dc7e08c8a8004cc2d4441211771f6308699e14be" exitCode=0 Dec 11 14:38:51 crc kubenswrapper[4898]: I1211 14:38:51.690822 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-lt87r" event={"ID":"20c294a4-6655-4e6e-a056-7685588b8fd8","Type":"ContainerDied","Data":"d843bae8a5e0c3171656c340dc7e08c8a8004cc2d4441211771f6308699e14be"} Dec 11 14:38:51 crc kubenswrapper[4898]: I1211 14:38:51.820686 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-pqc9z_ac70ed50-7e53-4bb9-ac63-35e5c0651db5/kube-rbac-proxy/0.log" Dec 11 14:38:51 crc kubenswrapper[4898]: I1211 14:38:51.982396 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-pqc9z_ac70ed50-7e53-4bb9-ac63-35e5c0651db5/manager/0.log" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.055644 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.114530 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20c294a4-6655-4e6e-a056-7685588b8fd8-utilities\") pod \"20c294a4-6655-4e6e-a056-7685588b8fd8\" (UID: \"20c294a4-6655-4e6e-a056-7685588b8fd8\") " Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.114600 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20c294a4-6655-4e6e-a056-7685588b8fd8-catalog-content\") pod \"20c294a4-6655-4e6e-a056-7685588b8fd8\" (UID: \"20c294a4-6655-4e6e-a056-7685588b8fd8\") " Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.114737 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h57vr\" (UniqueName: \"kubernetes.io/projected/20c294a4-6655-4e6e-a056-7685588b8fd8-kube-api-access-h57vr\") pod \"20c294a4-6655-4e6e-a056-7685588b8fd8\" (UID: \"20c294a4-6655-4e6e-a056-7685588b8fd8\") " Dec 11 14:38:52 crc 
kubenswrapper[4898]: I1211 14:38:52.115177 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20c294a4-6655-4e6e-a056-7685588b8fd8-utilities" (OuterVolumeSpecName: "utilities") pod "20c294a4-6655-4e6e-a056-7685588b8fd8" (UID: "20c294a4-6655-4e6e-a056-7685588b8fd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.116074 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20c294a4-6655-4e6e-a056-7685588b8fd8-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.131730 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20c294a4-6655-4e6e-a056-7685588b8fd8-kube-api-access-h57vr" (OuterVolumeSpecName: "kube-api-access-h57vr") pod "20c294a4-6655-4e6e-a056-7685588b8fd8" (UID: "20c294a4-6655-4e6e-a056-7685588b8fd8"). InnerVolumeSpecName "kube-api-access-h57vr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.219231 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h57vr\" (UniqueName: \"kubernetes.io/projected/20c294a4-6655-4e6e-a056-7685588b8fd8-kube-api-access-h57vr\") on node \"crc\" DevicePath \"\"" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.238874 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20c294a4-6655-4e6e-a056-7685588b8fd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20c294a4-6655-4e6e-a056-7685588b8fd8" (UID: "20c294a4-6655-4e6e-a056-7685588b8fd8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.253169 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-dvtzm_88e97f63-c1cb-4ef1-9d95-0c11dc52c94c/kube-rbac-proxy/0.log" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.278408 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-dvtzm_88e97f63-c1cb-4ef1-9d95-0c11dc52c94c/manager/0.log" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.322298 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20c294a4-6655-4e6e-a056-7685588b8fd8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.335642 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fv8ph7_1fbc642b-9636-47c2-a3db-7913fa4a6b91/kube-rbac-proxy/0.log" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.432775 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fv8ph7_1fbc642b-9636-47c2-a3db-7913fa4a6b91/manager/1.log" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.491091 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fv8ph7_1fbc642b-9636-47c2-a3db-7913fa4a6b91/manager/0.log" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.703876 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lt87r" event={"ID":"20c294a4-6655-4e6e-a056-7685588b8fd8","Type":"ContainerDied","Data":"80aa41e5babeae7cf850b4584996deefa9b2124b63df033f07a0995806f7a965"} Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 
14:38:52.703914 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lt87r" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.703933 4898 scope.go:117] "RemoveContainer" containerID="d843bae8a5e0c3171656c340dc7e08c8a8004cc2d4441211771f6308699e14be" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.707410 4898 generic.go:334] "Generic (PLEG): container finished" podID="71ef5c35-e68f-40ef-aa81-27670ff4dd27" containerID="60aef1bd8ae21f8135db27a873743cdf385975f2e0b680918a47c740791f4bab" exitCode=0 Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.707528 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9665j" event={"ID":"71ef5c35-e68f-40ef-aa81-27670ff4dd27","Type":"ContainerDied","Data":"60aef1bd8ae21f8135db27a873743cdf385975f2e0b680918a47c740791f4bab"} Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.751799 4898 scope.go:117] "RemoveContainer" containerID="4845f5c1aabb9b1d69742cf6c662345a5d96c4dd5381bd7be0e81c1f6ec607ce" Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.788100 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lt87r"] Dec 11 14:38:52 crc kubenswrapper[4898]: I1211 14:38:52.796706 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lt87r"] Dec 11 14:38:53 crc kubenswrapper[4898]: I1211 14:38:53.110628 4898 scope.go:117] "RemoveContainer" containerID="0722d22d7f71bb4d6bca583369ea6783dcc4289469decbc65aa4791e5e7beee1" Dec 11 14:38:53 crc kubenswrapper[4898]: I1211 14:38:53.398642 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-mxsdw_2911f97f-4469-4335-b6be-48a0e3c6fda8/registry-server/0.log" Dec 11 14:38:53 crc kubenswrapper[4898]: I1211 14:38:53.591622 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-nx4h5_95c66498-ab0d-4618-b884-523e1183d758/kube-rbac-proxy/0.log" Dec 11 14:38:53 crc kubenswrapper[4898]: I1211 14:38:53.691668 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5dc777b99d-mszqj_4e56a824-5c00-4a67-a8c3-a32a001f0ce4/operator/0.log" Dec 11 14:38:53 crc kubenswrapper[4898]: I1211 14:38:53.703640 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-nx4h5_95c66498-ab0d-4618-b884-523e1183d758/manager/0.log" Dec 11 14:38:53 crc kubenswrapper[4898]: I1211 14:38:53.858056 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-9p4w9_42b8c71f-abd8-49b1-b604-49b3292ba29a/manager/0.log" Dec 11 14:38:53 crc kubenswrapper[4898]: I1211 14:38:53.975840 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-9p4w9_42b8c71f-abd8-49b1-b604-49b3292ba29a/kube-rbac-proxy/0.log" Dec 11 14:38:53 crc kubenswrapper[4898]: I1211 14:38:53.995843 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-djqgv_c7e9c45d-ed03-4e5f-a585-5d1af92727f9/operator/0.log" Dec 11 14:38:54 crc kubenswrapper[4898]: I1211 14:38:54.151387 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-596947c645-4xjkq_c6d5540b-2eb6-411c-b1a9-b0db78e67ae7/manager/0.log" Dec 11 14:38:54 crc kubenswrapper[4898]: I1211 14:38:54.406618 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-2wt95_cb6adf46-208a-4945-97aa-2c457b9c2614/kube-rbac-proxy/0.log" Dec 11 14:38:54 crc kubenswrapper[4898]: I1211 14:38:54.448357 4898 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-2wt95_cb6adf46-208a-4945-97aa-2c457b9c2614/manager/0.log" Dec 11 14:38:54 crc kubenswrapper[4898]: I1211 14:38:54.591288 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-766b45bcdb-ksffb_f52c9389-ea61-4327-afd9-f4c92541a821/kube-rbac-proxy/0.log" Dec 11 14:38:54 crc kubenswrapper[4898]: I1211 14:38:54.680693 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-8d8zb_1a7e7363-7657-4eb2-a969-9f4c08a50984/kube-rbac-proxy/0.log" Dec 11 14:38:54 crc kubenswrapper[4898]: I1211 14:38:54.733444 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9665j" event={"ID":"71ef5c35-e68f-40ef-aa81-27670ff4dd27","Type":"ContainerStarted","Data":"613c14479d1ec398eaf3b850bb8b3f626b74bbe66e2c67abbf1ce159512693d2"} Dec 11 14:38:54 crc kubenswrapper[4898]: I1211 14:38:54.760186 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9665j" podStartSLOduration=4.095817338 podStartE2EDuration="7.760165015s" podCreationTimestamp="2025-12-11 14:38:47 +0000 UTC" firstStartedPulling="2025-12-11 14:38:49.666803907 +0000 UTC m=+5687.239130344" lastFinishedPulling="2025-12-11 14:38:53.331151584 +0000 UTC m=+5690.903478021" observedRunningTime="2025-12-11 14:38:54.754007751 +0000 UTC m=+5692.326334188" watchObservedRunningTime="2025-12-11 14:38:54.760165015 +0000 UTC m=+5692.332491452" Dec 11 14:38:54 crc kubenswrapper[4898]: I1211 14:38:54.797833 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20c294a4-6655-4e6e-a056-7685588b8fd8" path="/var/lib/kubelet/pods/20c294a4-6655-4e6e-a056-7685588b8fd8/volumes" Dec 11 14:38:54 crc kubenswrapper[4898]: I1211 14:38:54.861672 4898 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-766b45bcdb-ksffb_f52c9389-ea61-4327-afd9-f4c92541a821/manager/0.log" Dec 11 14:38:54 crc kubenswrapper[4898]: I1211 14:38:54.898008 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75944c9b7-9kw6z_7d6dbccc-94de-44f9-b7d2-5bbcfee1d119/kube-rbac-proxy/0.log" Dec 11 14:38:54 crc kubenswrapper[4898]: I1211 14:38:54.912516 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-8d8zb_1a7e7363-7657-4eb2-a969-9f4c08a50984/manager/0.log" Dec 11 14:38:54 crc kubenswrapper[4898]: I1211 14:38:54.992400 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75944c9b7-9kw6z_7d6dbccc-94de-44f9-b7d2-5bbcfee1d119/manager/0.log" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.424539 4898 trace.go:236] Trace[1801795142]: "Calculate volume metrics of run-httpd for pod openstack/swift-proxy-6867fd7bcf-bbj7b" (11-Dec-2025 14:38:56.307) (total time: 1112ms): Dec 11 14:38:57 crc kubenswrapper[4898]: Trace[1801795142]: [1.112225828s] [1.112225828s] END Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.573929 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pdwc5"] Dec 11 14:38:57 crc kubenswrapper[4898]: E1211 14:38:57.574560 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c294a4-6655-4e6e-a056-7685588b8fd8" containerName="extract-content" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.574577 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c294a4-6655-4e6e-a056-7685588b8fd8" containerName="extract-content" Dec 11 14:38:57 crc kubenswrapper[4898]: E1211 14:38:57.574603 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c294a4-6655-4e6e-a056-7685588b8fd8" 
containerName="registry-server" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.574658 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c294a4-6655-4e6e-a056-7685588b8fd8" containerName="registry-server" Dec 11 14:38:57 crc kubenswrapper[4898]: E1211 14:38:57.574691 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c294a4-6655-4e6e-a056-7685588b8fd8" containerName="extract-utilities" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.574700 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c294a4-6655-4e6e-a056-7685588b8fd8" containerName="extract-utilities" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.575002 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="20c294a4-6655-4e6e-a056-7685588b8fd8" containerName="registry-server" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.576644 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pdwc5"] Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.576735 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.749770 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62e19a83-12d3-4611-835b-c90337ab2663-catalog-content\") pod \"certified-operators-pdwc5\" (UID: \"62e19a83-12d3-4611-835b-c90337ab2663\") " pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.749918 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62e19a83-12d3-4611-835b-c90337ab2663-utilities\") pod \"certified-operators-pdwc5\" (UID: \"62e19a83-12d3-4611-835b-c90337ab2663\") " pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.750093 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhn84\" (UniqueName: \"kubernetes.io/projected/62e19a83-12d3-4611-835b-c90337ab2663-kube-api-access-xhn84\") pod \"certified-operators-pdwc5\" (UID: \"62e19a83-12d3-4611-835b-c90337ab2663\") " pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.835946 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9665j" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.835993 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9665j" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.851949 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhn84\" (UniqueName: \"kubernetes.io/projected/62e19a83-12d3-4611-835b-c90337ab2663-kube-api-access-xhn84\") pod 
\"certified-operators-pdwc5\" (UID: \"62e19a83-12d3-4611-835b-c90337ab2663\") " pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.852065 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62e19a83-12d3-4611-835b-c90337ab2663-catalog-content\") pod \"certified-operators-pdwc5\" (UID: \"62e19a83-12d3-4611-835b-c90337ab2663\") " pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.852166 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62e19a83-12d3-4611-835b-c90337ab2663-utilities\") pod \"certified-operators-pdwc5\" (UID: \"62e19a83-12d3-4611-835b-c90337ab2663\") " pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.852723 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62e19a83-12d3-4611-835b-c90337ab2663-catalog-content\") pod \"certified-operators-pdwc5\" (UID: \"62e19a83-12d3-4611-835b-c90337ab2663\") " pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.852870 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62e19a83-12d3-4611-835b-c90337ab2663-utilities\") pod \"certified-operators-pdwc5\" (UID: \"62e19a83-12d3-4611-835b-c90337ab2663\") " pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:38:57 crc kubenswrapper[4898]: I1211 14:38:57.896971 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9665j" Dec 11 14:38:59 crc kubenswrapper[4898]: I1211 14:38:59.772299 4898 fsHandler.go:133] fs: disk usage and inodes count on 
following dirs took 1.301158458s: [/var/lib/containers/storage/overlay/f2ec2cc6ea9b05c00e0891ed60c4e4f755f0ee3b4a768c5e6bf8aa0e2cdd4ec4/diff /var/log/pods/openstack_openstack-galera-0_2e7cffb6-80f8-45e8-a4ab-219dc834a613/galera/1.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:38:59 crc kubenswrapper[4898]: I1211 14:38:59.772297 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.286911441s: [/var/lib/containers/storage/overlay/ed28466c7812794e45d4e08037676e038ba7dd06a6347d3909677bba8628aef3/diff /var/log/pods/openstack_ceilometer-0_d71ee22f-68e7-43d7-8a6a-012ff8b8104e/ceilometer-central-agent/1.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:38:59 crc kubenswrapper[4898]: I1211 14:38:59.772333 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.269675622s: [/var/lib/containers/storage/overlay/2cce9161cc75e0e640cc5581d358770d4147eefb71a6e8002dc8c3f83b248949/diff /var/log/pods/openstack_openstack-cell1-galera-0_5ae31191-f9f6-452a-8f45-a48b4736012e/galera/1.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:38:59 crc kubenswrapper[4898]: I1211 14:38:59.785264 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhn84\" (UniqueName: \"kubernetes.io/projected/62e19a83-12d3-4611-835b-c90337ab2663-kube-api-access-xhn84\") pod \"certified-operators-pdwc5\" (UID: \"62e19a83-12d3-4611-835b-c90337ab2663\") " pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:39:00 crc kubenswrapper[4898]: I1211 14:39:00.007164 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:39:00 crc kubenswrapper[4898]: I1211 14:39:00.708031 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pdwc5"] Dec 11 14:39:01 crc kubenswrapper[4898]: I1211 14:39:01.609121 4898 generic.go:334] "Generic (PLEG): container finished" podID="62e19a83-12d3-4611-835b-c90337ab2663" containerID="3231579a60cc9687d6fccf2fdd7c6b5e048474088b04750558eee185c0abc6e4" exitCode=0 Dec 11 14:39:01 crc kubenswrapper[4898]: I1211 14:39:01.609707 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdwc5" event={"ID":"62e19a83-12d3-4611-835b-c90337ab2663","Type":"ContainerDied","Data":"3231579a60cc9687d6fccf2fdd7c6b5e048474088b04750558eee185c0abc6e4"} Dec 11 14:39:01 crc kubenswrapper[4898]: I1211 14:39:01.609798 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdwc5" event={"ID":"62e19a83-12d3-4611-835b-c90337ab2663","Type":"ContainerStarted","Data":"6d510aeac1f6d8ba8ff4e9d5c4bcaed5aa7b030f92ab311a9af1d7f8a8b49104"} Dec 11 14:39:03 crc kubenswrapper[4898]: I1211 14:39:03.651144 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdwc5" event={"ID":"62e19a83-12d3-4611-835b-c90337ab2663","Type":"ContainerStarted","Data":"d1ca245621a1f57c4582764b7ee510b76df9e6e96f752416f3a5885b58b33579"} Dec 11 14:39:05 crc kubenswrapper[4898]: I1211 14:39:05.676783 4898 generic.go:334] "Generic (PLEG): container finished" podID="62e19a83-12d3-4611-835b-c90337ab2663" containerID="d1ca245621a1f57c4582764b7ee510b76df9e6e96f752416f3a5885b58b33579" exitCode=0 Dec 11 14:39:05 crc kubenswrapper[4898]: I1211 14:39:05.676925 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdwc5" 
event={"ID":"62e19a83-12d3-4611-835b-c90337ab2663","Type":"ContainerDied","Data":"d1ca245621a1f57c4582764b7ee510b76df9e6e96f752416f3a5885b58b33579"} Dec 11 14:39:07 crc kubenswrapper[4898]: I1211 14:39:07.720904 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdwc5" event={"ID":"62e19a83-12d3-4611-835b-c90337ab2663","Type":"ContainerStarted","Data":"2442646bb0ed92dd784880da0951e77b838a5a135fa00cbe9f0ba2652a1e42eb"} Dec 11 14:39:07 crc kubenswrapper[4898]: I1211 14:39:07.744955 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pdwc5" podStartSLOduration=6.962509739 podStartE2EDuration="11.74493395s" podCreationTimestamp="2025-12-11 14:38:56 +0000 UTC" firstStartedPulling="2025-12-11 14:39:01.612948889 +0000 UTC m=+5699.185275326" lastFinishedPulling="2025-12-11 14:39:06.39537307 +0000 UTC m=+5703.967699537" observedRunningTime="2025-12-11 14:39:07.738876828 +0000 UTC m=+5705.311203265" watchObservedRunningTime="2025-12-11 14:39:07.74493395 +0000 UTC m=+5705.317260387" Dec 11 14:39:07 crc kubenswrapper[4898]: I1211 14:39:07.912608 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9665j" Dec 11 14:39:07 crc kubenswrapper[4898]: I1211 14:39:07.964314 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9665j"] Dec 11 14:39:08 crc kubenswrapper[4898]: I1211 14:39:08.731976 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9665j" podUID="71ef5c35-e68f-40ef-aa81-27670ff4dd27" containerName="registry-server" containerID="cri-o://613c14479d1ec398eaf3b850bb8b3f626b74bbe66e2c67abbf1ce159512693d2" gracePeriod=2 Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.327513 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9665j" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.472886 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71ef5c35-e68f-40ef-aa81-27670ff4dd27-utilities\") pod \"71ef5c35-e68f-40ef-aa81-27670ff4dd27\" (UID: \"71ef5c35-e68f-40ef-aa81-27670ff4dd27\") " Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.473176 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71ef5c35-e68f-40ef-aa81-27670ff4dd27-catalog-content\") pod \"71ef5c35-e68f-40ef-aa81-27670ff4dd27\" (UID: \"71ef5c35-e68f-40ef-aa81-27670ff4dd27\") " Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.473306 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glmjj\" (UniqueName: \"kubernetes.io/projected/71ef5c35-e68f-40ef-aa81-27670ff4dd27-kube-api-access-glmjj\") pod \"71ef5c35-e68f-40ef-aa81-27670ff4dd27\" (UID: \"71ef5c35-e68f-40ef-aa81-27670ff4dd27\") " Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.473766 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71ef5c35-e68f-40ef-aa81-27670ff4dd27-utilities" (OuterVolumeSpecName: "utilities") pod "71ef5c35-e68f-40ef-aa81-27670ff4dd27" (UID: "71ef5c35-e68f-40ef-aa81-27670ff4dd27"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.474088 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71ef5c35-e68f-40ef-aa81-27670ff4dd27-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.484319 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ef5c35-e68f-40ef-aa81-27670ff4dd27-kube-api-access-glmjj" (OuterVolumeSpecName: "kube-api-access-glmjj") pod "71ef5c35-e68f-40ef-aa81-27670ff4dd27" (UID: "71ef5c35-e68f-40ef-aa81-27670ff4dd27"). InnerVolumeSpecName "kube-api-access-glmjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.534251 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71ef5c35-e68f-40ef-aa81-27670ff4dd27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71ef5c35-e68f-40ef-aa81-27670ff4dd27" (UID: "71ef5c35-e68f-40ef-aa81-27670ff4dd27"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.576252 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71ef5c35-e68f-40ef-aa81-27670ff4dd27-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.576291 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glmjj\" (UniqueName: \"kubernetes.io/projected/71ef5c35-e68f-40ef-aa81-27670ff4dd27-kube-api-access-glmjj\") on node \"crc\" DevicePath \"\"" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.749850 4898 generic.go:334] "Generic (PLEG): container finished" podID="71ef5c35-e68f-40ef-aa81-27670ff4dd27" containerID="613c14479d1ec398eaf3b850bb8b3f626b74bbe66e2c67abbf1ce159512693d2" exitCode=0 Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.749913 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9665j" event={"ID":"71ef5c35-e68f-40ef-aa81-27670ff4dd27","Type":"ContainerDied","Data":"613c14479d1ec398eaf3b850bb8b3f626b74bbe66e2c67abbf1ce159512693d2"} Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.749954 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9665j" event={"ID":"71ef5c35-e68f-40ef-aa81-27670ff4dd27","Type":"ContainerDied","Data":"95959ab1e34ba12628f60c8c7d009f4a084639a7f11a268cc658186aa5b14ddb"} Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.749973 4898 scope.go:117] "RemoveContainer" containerID="613c14479d1ec398eaf3b850bb8b3f626b74bbe66e2c67abbf1ce159512693d2" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.751676 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9665j" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.774417 4898 scope.go:117] "RemoveContainer" containerID="60aef1bd8ae21f8135db27a873743cdf385975f2e0b680918a47c740791f4bab" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.797321 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9665j"] Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.810608 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9665j"] Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.815982 4898 scope.go:117] "RemoveContainer" containerID="f3a51724ba2ab033362c08f7f75ed2ca58d08619c6f4813fefcdb22e3ed98581" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.861340 4898 scope.go:117] "RemoveContainer" containerID="613c14479d1ec398eaf3b850bb8b3f626b74bbe66e2c67abbf1ce159512693d2" Dec 11 14:39:09 crc kubenswrapper[4898]: E1211 14:39:09.863210 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"613c14479d1ec398eaf3b850bb8b3f626b74bbe66e2c67abbf1ce159512693d2\": container with ID starting with 613c14479d1ec398eaf3b850bb8b3f626b74bbe66e2c67abbf1ce159512693d2 not found: ID does not exist" containerID="613c14479d1ec398eaf3b850bb8b3f626b74bbe66e2c67abbf1ce159512693d2" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.863250 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613c14479d1ec398eaf3b850bb8b3f626b74bbe66e2c67abbf1ce159512693d2"} err="failed to get container status \"613c14479d1ec398eaf3b850bb8b3f626b74bbe66e2c67abbf1ce159512693d2\": rpc error: code = NotFound desc = could not find container \"613c14479d1ec398eaf3b850bb8b3f626b74bbe66e2c67abbf1ce159512693d2\": container with ID starting with 613c14479d1ec398eaf3b850bb8b3f626b74bbe66e2c67abbf1ce159512693d2 not 
found: ID does not exist" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.863272 4898 scope.go:117] "RemoveContainer" containerID="60aef1bd8ae21f8135db27a873743cdf385975f2e0b680918a47c740791f4bab" Dec 11 14:39:09 crc kubenswrapper[4898]: E1211 14:39:09.863714 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60aef1bd8ae21f8135db27a873743cdf385975f2e0b680918a47c740791f4bab\": container with ID starting with 60aef1bd8ae21f8135db27a873743cdf385975f2e0b680918a47c740791f4bab not found: ID does not exist" containerID="60aef1bd8ae21f8135db27a873743cdf385975f2e0b680918a47c740791f4bab" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.863736 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60aef1bd8ae21f8135db27a873743cdf385975f2e0b680918a47c740791f4bab"} err="failed to get container status \"60aef1bd8ae21f8135db27a873743cdf385975f2e0b680918a47c740791f4bab\": rpc error: code = NotFound desc = could not find container \"60aef1bd8ae21f8135db27a873743cdf385975f2e0b680918a47c740791f4bab\": container with ID starting with 60aef1bd8ae21f8135db27a873743cdf385975f2e0b680918a47c740791f4bab not found: ID does not exist" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.863755 4898 scope.go:117] "RemoveContainer" containerID="f3a51724ba2ab033362c08f7f75ed2ca58d08619c6f4813fefcdb22e3ed98581" Dec 11 14:39:09 crc kubenswrapper[4898]: E1211 14:39:09.864794 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3a51724ba2ab033362c08f7f75ed2ca58d08619c6f4813fefcdb22e3ed98581\": container with ID starting with f3a51724ba2ab033362c08f7f75ed2ca58d08619c6f4813fefcdb22e3ed98581 not found: ID does not exist" containerID="f3a51724ba2ab033362c08f7f75ed2ca58d08619c6f4813fefcdb22e3ed98581" Dec 11 14:39:09 crc kubenswrapper[4898]: I1211 14:39:09.864817 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a51724ba2ab033362c08f7f75ed2ca58d08619c6f4813fefcdb22e3ed98581"} err="failed to get container status \"f3a51724ba2ab033362c08f7f75ed2ca58d08619c6f4813fefcdb22e3ed98581\": rpc error: code = NotFound desc = could not find container \"f3a51724ba2ab033362c08f7f75ed2ca58d08619c6f4813fefcdb22e3ed98581\": container with ID starting with f3a51724ba2ab033362c08f7f75ed2ca58d08619c6f4813fefcdb22e3ed98581 not found: ID does not exist" Dec 11 14:39:10 crc kubenswrapper[4898]: I1211 14:39:10.008224 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:39:10 crc kubenswrapper[4898]: I1211 14:39:10.009702 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:39:10 crc kubenswrapper[4898]: I1211 14:39:10.788429 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ef5c35-e68f-40ef-aa81-27670ff4dd27" path="/var/lib/kubelet/pods/71ef5c35-e68f-40ef-aa81-27670ff4dd27/volumes" Dec 11 14:39:11 crc kubenswrapper[4898]: I1211 14:39:11.072255 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pdwc5" podUID="62e19a83-12d3-4611-835b-c90337ab2663" containerName="registry-server" probeResult="failure" output=< Dec 11 14:39:11 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Dec 11 14:39:11 crc kubenswrapper[4898]: > Dec 11 14:39:20 crc kubenswrapper[4898]: I1211 14:39:20.068030 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:39:20 crc kubenswrapper[4898]: I1211 14:39:20.138297 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:39:20 crc kubenswrapper[4898]: 
I1211 14:39:20.321770 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pdwc5"] Dec 11 14:39:21 crc kubenswrapper[4898]: I1211 14:39:21.891681 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pdwc5" podUID="62e19a83-12d3-4611-835b-c90337ab2663" containerName="registry-server" containerID="cri-o://2442646bb0ed92dd784880da0951e77b838a5a135fa00cbe9f0ba2652a1e42eb" gracePeriod=2 Dec 11 14:39:22 crc kubenswrapper[4898]: I1211 14:39:22.042059 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-69ffd5987-jj95c container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.61:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:39:22 crc kubenswrapper[4898]: I1211 14:39:22.042122 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-69ffd5987-jj95c" podUID="c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.61:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 14:39:22 crc kubenswrapper[4898]: E1211 14:39:22.043973 4898 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.268s" Dec 11 14:39:22 crc kubenswrapper[4898]: I1211 14:39:22.048161 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.121179789s: [/var/lib/containers/storage/overlay/9dda9105bd1ff64c66655f5714b30ae4f7adc47a6cef46b407812ec6ddff7425/diff /var/log/pods/openstack_aodh-0_bd49b53a-7cbb-4ccf-953a-7ed292c090bf/aodh-listener/0.log]; will not log again for this container unless duration exceeds 2s Dec 11 14:39:22 crc kubenswrapper[4898]: I1211 14:39:22.100040 4898 trace.go:236] Trace[806613732]: "Calculate volume metrics of 
must-gather-output for pod openshift-must-gather-h99mt/must-gather-rk4kt" (11-Dec-2025 14:39:20.579) (total time: 1520ms): Dec 11 14:39:22 crc kubenswrapper[4898]: Trace[806613732]: [1.52007052s] [1.52007052s] END Dec 11 14:39:22 crc kubenswrapper[4898]: I1211 14:39:22.902856 4898 generic.go:334] "Generic (PLEG): container finished" podID="62e19a83-12d3-4611-835b-c90337ab2663" containerID="2442646bb0ed92dd784880da0951e77b838a5a135fa00cbe9f0ba2652a1e42eb" exitCode=0 Dec 11 14:39:22 crc kubenswrapper[4898]: I1211 14:39:22.902939 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdwc5" event={"ID":"62e19a83-12d3-4611-835b-c90337ab2663","Type":"ContainerDied","Data":"2442646bb0ed92dd784880da0951e77b838a5a135fa00cbe9f0ba2652a1e42eb"} Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.012474 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.122764 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62e19a83-12d3-4611-835b-c90337ab2663-utilities\") pod \"62e19a83-12d3-4611-835b-c90337ab2663\" (UID: \"62e19a83-12d3-4611-835b-c90337ab2663\") " Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.123008 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62e19a83-12d3-4611-835b-c90337ab2663-catalog-content\") pod \"62e19a83-12d3-4611-835b-c90337ab2663\" (UID: \"62e19a83-12d3-4611-835b-c90337ab2663\") " Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.123163 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhn84\" (UniqueName: \"kubernetes.io/projected/62e19a83-12d3-4611-835b-c90337ab2663-kube-api-access-xhn84\") pod 
\"62e19a83-12d3-4611-835b-c90337ab2663\" (UID: \"62e19a83-12d3-4611-835b-c90337ab2663\") " Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.124284 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62e19a83-12d3-4611-835b-c90337ab2663-utilities" (OuterVolumeSpecName: "utilities") pod "62e19a83-12d3-4611-835b-c90337ab2663" (UID: "62e19a83-12d3-4611-835b-c90337ab2663"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.132076 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62e19a83-12d3-4611-835b-c90337ab2663-kube-api-access-xhn84" (OuterVolumeSpecName: "kube-api-access-xhn84") pod "62e19a83-12d3-4611-835b-c90337ab2663" (UID: "62e19a83-12d3-4611-835b-c90337ab2663"). InnerVolumeSpecName "kube-api-access-xhn84". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.196029 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62e19a83-12d3-4611-835b-c90337ab2663-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62e19a83-12d3-4611-835b-c90337ab2663" (UID: "62e19a83-12d3-4611-835b-c90337ab2663"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.226000 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62e19a83-12d3-4611-835b-c90337ab2663-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.226036 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62e19a83-12d3-4611-835b-c90337ab2663-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.226049 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhn84\" (UniqueName: \"kubernetes.io/projected/62e19a83-12d3-4611-835b-c90337ab2663-kube-api-access-xhn84\") on node \"crc\" DevicePath \"\"" Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.715821 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-676rf_aeb38d85-05d0-4a84-b3a7-4a7a168ccd98/control-plane-machine-set-operator/0.log" Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.916199 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pdwc5" event={"ID":"62e19a83-12d3-4611-835b-c90337ab2663","Type":"ContainerDied","Data":"6d510aeac1f6d8ba8ff4e9d5c4bcaed5aa7b030f92ab311a9af1d7f8a8b49104"} Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.916254 4898 scope.go:117] "RemoveContainer" containerID="2442646bb0ed92dd784880da0951e77b838a5a135fa00cbe9f0ba2652a1e42eb" Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.916407 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pdwc5" Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.945714 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lqr65_1dd4c49a-1898-48ab-9ed0-4f455f392b57/kube-rbac-proxy/0.log" Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.958930 4898 scope.go:117] "RemoveContainer" containerID="d1ca245621a1f57c4582764b7ee510b76df9e6e96f752416f3a5885b58b33579" Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.959690 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pdwc5"] Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.970317 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pdwc5"] Dec 11 14:39:23 crc kubenswrapper[4898]: I1211 14:39:23.978857 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lqr65_1dd4c49a-1898-48ab-9ed0-4f455f392b57/machine-api-operator/0.log" Dec 11 14:39:24 crc kubenswrapper[4898]: I1211 14:39:24.009537 4898 scope.go:117] "RemoveContainer" containerID="3231579a60cc9687d6fccf2fdd7c6b5e048474088b04750558eee185c0abc6e4" Dec 11 14:39:24 crc kubenswrapper[4898]: I1211 14:39:24.813581 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62e19a83-12d3-4611-835b-c90337ab2663" path="/var/lib/kubelet/pods/62e19a83-12d3-4611-835b-c90337ab2663/volumes" Dec 11 14:39:34 crc kubenswrapper[4898]: I1211 14:39:34.996163 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:39:34 crc kubenswrapper[4898]: I1211 14:39:34.996680 4898 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:39:40 crc kubenswrapper[4898]: I1211 14:39:40.820034 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-h9lc7_8e5e03d3-bb27-44ef-9f33-fe1175a655ed/cert-manager-controller/0.log" Dec 11 14:39:40 crc kubenswrapper[4898]: I1211 14:39:40.981016 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-6wct8_df56f399-1275-4a62-b700-05d108445723/cert-manager-cainjector/0.log" Dec 11 14:39:41 crc kubenswrapper[4898]: I1211 14:39:41.047664 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-dwxc2_80c45250-6b80-452d-ade1-a8b024cabf10/cert-manager-webhook/0.log" Dec 11 14:39:55 crc kubenswrapper[4898]: I1211 14:39:55.916688 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-xz82s_d33832d5-019a-4630-84b6-01df5d77cade/nmstate-console-plugin/0.log" Dec 11 14:39:56 crc kubenswrapper[4898]: I1211 14:39:56.143808 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-s4vbv_caec89bd-563f-4065-87ce-2cb58b5e4dc9/nmstate-handler/0.log" Dec 11 14:39:56 crc kubenswrapper[4898]: I1211 14:39:56.242753 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-65snc_231b14b2-7f77-4900-ba69-07247827770f/kube-rbac-proxy/0.log" Dec 11 14:39:56 crc kubenswrapper[4898]: I1211 14:39:56.265858 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-65snc_231b14b2-7f77-4900-ba69-07247827770f/nmstate-metrics/0.log" Dec 11 14:39:56 crc 
kubenswrapper[4898]: I1211 14:39:56.421302 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-j52sl_41793757-3faf-4702-b024-5e2ab032b432/nmstate-operator/0.log" Dec 11 14:39:56 crc kubenswrapper[4898]: I1211 14:39:56.563505 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-z6l22_79dd8f49-7447-49a9-84a3-252ac5286cc3/nmstate-webhook/0.log" Dec 11 14:40:04 crc kubenswrapper[4898]: I1211 14:40:04.995633 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:40:04 crc kubenswrapper[4898]: I1211 14:40:04.996212 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:40:11 crc kubenswrapper[4898]: I1211 14:40:11.260824 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7d9d9f99f6-7sstc_b4cf67d3-b13e-4afb-be20-80dc0801c69c/kube-rbac-proxy/0.log" Dec 11 14:40:11 crc kubenswrapper[4898]: I1211 14:40:11.272785 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7d9d9f99f6-7sstc_b4cf67d3-b13e-4afb-be20-80dc0801c69c/manager/0.log" Dec 11 14:40:27 crc kubenswrapper[4898]: I1211 14:40:27.100238 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-5qw7n_7ffaf6d3-c69c-4b78-8364-be63b25056c3/cluster-logging-operator/0.log" Dec 11 14:40:27 crc 
kubenswrapper[4898]: I1211 14:40:27.342090 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-ql6sd_c696c635-6d7a-40d8-aef4-ee5781067e7f/collector/0.log" Dec 11 14:40:27 crc kubenswrapper[4898]: I1211 14:40:27.387196 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_3cd3cd1d-9ead-4620-a346-f83e9e5190ba/loki-compactor/0.log" Dec 11 14:40:27 crc kubenswrapper[4898]: I1211 14:40:27.547250 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-qjz7m_841a3e5b-876d-43b2-b24a-d5c01876c30d/loki-distributor/0.log" Dec 11 14:40:27 crc kubenswrapper[4898]: I1211 14:40:27.595643 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-69ffd5987-jj95c_c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce/gateway/0.log" Dec 11 14:40:27 crc kubenswrapper[4898]: I1211 14:40:27.600744 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-69ffd5987-jj95c_c8cb2c04-2c3f-4ffa-95b3-7e5c32a159ce/opa/0.log" Dec 11 14:40:28 crc kubenswrapper[4898]: I1211 14:40:28.486524 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-69ffd5987-wz9b5_6d6f1657-b9dd-4fba-a216-ca660a4fa958/gateway/0.log" Dec 11 14:40:28 crc kubenswrapper[4898]: I1211 14:40:28.517861 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-69ffd5987-wz9b5_6d6f1657-b9dd-4fba-a216-ca660a4fa958/opa/0.log" Dec 11 14:40:28 crc kubenswrapper[4898]: I1211 14:40:28.648997 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_8571f1ff-bc62-45f4-a34d-d221e36df569/loki-index-gateway/0.log" Dec 11 14:40:28 crc kubenswrapper[4898]: I1211 14:40:28.785901 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-ingester-0_f6ad4db8-64f3-403c-9c92-9033a73ed12c/loki-ingester/0.log" Dec 11 14:40:28 crc kubenswrapper[4898]: I1211 14:40:28.864435 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-62tqd_4ecec7bc-8ce0-46ce-99fd-2a6fbcc626d0/loki-querier/0.log" Dec 11 14:40:28 crc kubenswrapper[4898]: I1211 14:40:28.979757 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-rjbbq_98efd2cb-8cd8-49c2-a54b-5a04cf51dc71/loki-query-frontend/0.log" Dec 11 14:40:34 crc kubenswrapper[4898]: I1211 14:40:34.995886 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:40:34 crc kubenswrapper[4898]: I1211 14:40:34.996390 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:40:34 crc kubenswrapper[4898]: I1211 14:40:34.996439 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 14:40:34 crc kubenswrapper[4898]: I1211 14:40:34.997362 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c3848400d3383130ae2106d38db8971207b13a878503083f06812e151d95d89"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Dec 11 14:40:34 crc kubenswrapper[4898]: I1211 14:40:34.997417 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://8c3848400d3383130ae2106d38db8971207b13a878503083f06812e151d95d89" gracePeriod=600 Dec 11 14:40:35 crc kubenswrapper[4898]: I1211 14:40:35.719655 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="8c3848400d3383130ae2106d38db8971207b13a878503083f06812e151d95d89" exitCode=0 Dec 11 14:40:35 crc kubenswrapper[4898]: I1211 14:40:35.719710 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"8c3848400d3383130ae2106d38db8971207b13a878503083f06812e151d95d89"} Dec 11 14:40:35 crc kubenswrapper[4898]: I1211 14:40:35.720062 4898 scope.go:117] "RemoveContainer" containerID="c9d8382e717f95e9e1fb7259437022afee2b7230a670db5835eea6c52595a22c" Dec 11 14:40:36 crc kubenswrapper[4898]: I1211 14:40:36.734300 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae"} Dec 11 14:40:43 crc kubenswrapper[4898]: I1211 14:40:43.850260 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-bkr8v_cfd0fc01-1bde-4b11-bbdd-d95693d0dd15/kube-rbac-proxy/0.log" Dec 11 14:40:43 crc kubenswrapper[4898]: I1211 14:40:43.923221 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-bkr8v_cfd0fc01-1bde-4b11-bbdd-d95693d0dd15/controller/0.log" Dec 11 14:40:44 crc 
kubenswrapper[4898]: I1211 14:40:44.095122 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/cp-frr-files/0.log" Dec 11 14:40:44 crc kubenswrapper[4898]: I1211 14:40:44.249862 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/cp-frr-files/0.log" Dec 11 14:40:44 crc kubenswrapper[4898]: I1211 14:40:44.284559 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/cp-reloader/0.log" Dec 11 14:40:44 crc kubenswrapper[4898]: I1211 14:40:44.316691 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/cp-metrics/0.log" Dec 11 14:40:44 crc kubenswrapper[4898]: I1211 14:40:44.384645 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/cp-reloader/0.log" Dec 11 14:40:44 crc kubenswrapper[4898]: I1211 14:40:44.621426 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/cp-reloader/0.log" Dec 11 14:40:44 crc kubenswrapper[4898]: I1211 14:40:44.638201 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/cp-metrics/0.log" Dec 11 14:40:44 crc kubenswrapper[4898]: I1211 14:40:44.666368 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/cp-frr-files/0.log" Dec 11 14:40:44 crc kubenswrapper[4898]: I1211 14:40:44.688539 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/cp-metrics/0.log" Dec 11 14:40:44 crc kubenswrapper[4898]: I1211 14:40:44.872534 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/cp-reloader/0.log" Dec 11 14:40:44 crc kubenswrapper[4898]: I1211 14:40:44.893720 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/controller/0.log" Dec 11 14:40:44 crc kubenswrapper[4898]: I1211 14:40:44.920494 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/cp-frr-files/0.log" Dec 11 14:40:44 crc kubenswrapper[4898]: I1211 14:40:44.921947 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/cp-metrics/0.log" Dec 11 14:40:45 crc kubenswrapper[4898]: I1211 14:40:45.130114 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/kube-rbac-proxy/0.log" Dec 11 14:40:45 crc kubenswrapper[4898]: I1211 14:40:45.190940 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/frr-metrics/0.log" Dec 11 14:40:45 crc kubenswrapper[4898]: I1211 14:40:45.234278 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/frr/1.log" Dec 11 14:40:45 crc kubenswrapper[4898]: I1211 14:40:45.393334 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/reloader/0.log" Dec 11 14:40:45 crc kubenswrapper[4898]: I1211 14:40:45.404699 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/kube-rbac-proxy-frr/0.log" Dec 11 14:40:45 crc kubenswrapper[4898]: I1211 14:40:45.588711 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-llp2z_c0569df8-06fa-4d31-a59a-904b90e4a0ca/frr-k8s-webhook-server/0.log" Dec 11 14:40:45 crc kubenswrapper[4898]: I1211 14:40:45.782265 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d49b6f5c7-png4s_a3162798-e8d2-458c-a559-9a246c2cae3b/manager/0.log" Dec 11 14:40:45 crc kubenswrapper[4898]: I1211 14:40:45.858770 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-585b5958fd-m8kkm_351e2ec9-301e-4fd9-b8ef-45494b9a1291/webhook-server/0.log" Dec 11 14:40:46 crc kubenswrapper[4898]: I1211 14:40:46.055943 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7jgw9_80d1af81-ad34-4f94-afd2-94c3773ea9ea/kube-rbac-proxy/0.log" Dec 11 14:40:46 crc kubenswrapper[4898]: I1211 14:40:46.754953 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7jgw9_80d1af81-ad34-4f94-afd2-94c3773ea9ea/speaker/0.log" Dec 11 14:40:46 crc kubenswrapper[4898]: I1211 14:40:46.837334 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6gtqk_d898c00f-7c50-483d-84f3-9c502696b39a/frr/0.log" Dec 11 14:41:03 crc kubenswrapper[4898]: I1211 14:41:03.238108 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7_2e30a8bb-3d26-41d3-af16-de028edba0ff/util/0.log" Dec 11 14:41:03 crc kubenswrapper[4898]: I1211 14:41:03.352285 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7_2e30a8bb-3d26-41d3-af16-de028edba0ff/util/0.log" Dec 11 14:41:03 crc kubenswrapper[4898]: I1211 14:41:03.408430 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7_2e30a8bb-3d26-41d3-af16-de028edba0ff/pull/0.log" Dec 11 14:41:03 crc kubenswrapper[4898]: I1211 14:41:03.469191 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7_2e30a8bb-3d26-41d3-af16-de028edba0ff/pull/0.log" Dec 11 14:41:03 crc kubenswrapper[4898]: I1211 14:41:03.680943 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7_2e30a8bb-3d26-41d3-af16-de028edba0ff/pull/0.log" Dec 11 14:41:03 crc kubenswrapper[4898]: I1211 14:41:03.705178 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7_2e30a8bb-3d26-41d3-af16-de028edba0ff/extract/0.log" Dec 11 14:41:03 crc kubenswrapper[4898]: I1211 14:41:03.771197 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8td8s7_2e30a8bb-3d26-41d3-af16-de028edba0ff/util/0.log" Dec 11 14:41:03 crc kubenswrapper[4898]: I1211 14:41:03.887266 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d_b949410c-72a6-4040-b66d-daacd4c2c4e2/util/0.log" Dec 11 14:41:05 crc kubenswrapper[4898]: I1211 14:41:05.076722 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d_b949410c-72a6-4040-b66d-daacd4c2c4e2/pull/0.log" Dec 11 14:41:05 crc kubenswrapper[4898]: I1211 14:41:05.116719 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d_b949410c-72a6-4040-b66d-daacd4c2c4e2/util/0.log" Dec 11 
14:41:05 crc kubenswrapper[4898]: I1211 14:41:05.159406 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d_b949410c-72a6-4040-b66d-daacd4c2c4e2/pull/0.log" Dec 11 14:41:05 crc kubenswrapper[4898]: I1211 14:41:05.272853 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d_b949410c-72a6-4040-b66d-daacd4c2c4e2/util/0.log" Dec 11 14:41:05 crc kubenswrapper[4898]: I1211 14:41:05.307274 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d_b949410c-72a6-4040-b66d-daacd4c2c4e2/pull/0.log" Dec 11 14:41:05 crc kubenswrapper[4898]: I1211 14:41:05.379716 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d46qf9d_b949410c-72a6-4040-b66d-daacd4c2c4e2/extract/0.log" Dec 11 14:41:05 crc kubenswrapper[4898]: I1211 14:41:05.475247 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4_26dfece8-854b-4146-bcee-c25943aab4b2/util/0.log" Dec 11 14:41:05 crc kubenswrapper[4898]: I1211 14:41:05.648856 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4_26dfece8-854b-4146-bcee-c25943aab4b2/util/0.log" Dec 11 14:41:05 crc kubenswrapper[4898]: I1211 14:41:05.649809 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4_26dfece8-854b-4146-bcee-c25943aab4b2/pull/0.log" Dec 11 14:41:05 crc kubenswrapper[4898]: I1211 14:41:05.692013 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4_26dfece8-854b-4146-bcee-c25943aab4b2/pull/0.log" Dec 11 14:41:05 crc kubenswrapper[4898]: I1211 14:41:05.852588 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4_26dfece8-854b-4146-bcee-c25943aab4b2/pull/0.log" Dec 11 14:41:05 crc kubenswrapper[4898]: I1211 14:41:05.861417 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4_26dfece8-854b-4146-bcee-c25943aab4b2/util/0.log" Dec 11 14:41:05 crc kubenswrapper[4898]: I1211 14:41:05.882438 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108dkt4_26dfece8-854b-4146-bcee-c25943aab4b2/extract/0.log" Dec 11 14:41:06 crc kubenswrapper[4898]: I1211 14:41:06.054557 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c_c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e/util/0.log" Dec 11 14:41:06 crc kubenswrapper[4898]: I1211 14:41:06.259294 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c_c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e/util/0.log" Dec 11 14:41:06 crc kubenswrapper[4898]: I1211 14:41:06.287762 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c_c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e/pull/0.log" Dec 11 14:41:06 crc kubenswrapper[4898]: I1211 14:41:06.287905 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c_c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e/pull/0.log" Dec 11 
14:41:06 crc kubenswrapper[4898]: I1211 14:41:06.499483 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c_c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e/util/0.log" Dec 11 14:41:06 crc kubenswrapper[4898]: I1211 14:41:06.506743 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c_c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e/pull/0.log" Dec 11 14:41:06 crc kubenswrapper[4898]: I1211 14:41:06.510898 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa84jf9c_c9d7d79c-b405-4c5b-b4b9-860f7d4dea0e/extract/0.log" Dec 11 14:41:06 crc kubenswrapper[4898]: I1211 14:41:06.526067 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx_77e660a9-36a2-45e9-9d33-a5a6feb01cd2/util/0.log" Dec 11 14:41:06 crc kubenswrapper[4898]: I1211 14:41:06.702468 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx_77e660a9-36a2-45e9-9d33-a5a6feb01cd2/util/0.log" Dec 11 14:41:06 crc kubenswrapper[4898]: I1211 14:41:06.724431 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx_77e660a9-36a2-45e9-9d33-a5a6feb01cd2/pull/0.log" Dec 11 14:41:06 crc kubenswrapper[4898]: I1211 14:41:06.728757 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx_77e660a9-36a2-45e9-9d33-a5a6feb01cd2/pull/0.log" Dec 11 14:41:06 crc kubenswrapper[4898]: I1211 14:41:06.894142 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx_77e660a9-36a2-45e9-9d33-a5a6feb01cd2/util/0.log" Dec 11 14:41:06 crc kubenswrapper[4898]: I1211 14:41:06.895321 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx_77e660a9-36a2-45e9-9d33-a5a6feb01cd2/pull/0.log" Dec 11 14:41:06 crc kubenswrapper[4898]: I1211 14:41:06.927470 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f542dx_77e660a9-36a2-45e9-9d33-a5a6feb01cd2/extract/0.log" Dec 11 14:41:06 crc kubenswrapper[4898]: I1211 14:41:06.980804 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkc2q_b434beff-f5d1-4b10-8715-10cdfe445919/extract-utilities/0.log" Dec 11 14:41:07 crc kubenswrapper[4898]: I1211 14:41:07.110002 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkc2q_b434beff-f5d1-4b10-8715-10cdfe445919/extract-content/0.log" Dec 11 14:41:07 crc kubenswrapper[4898]: I1211 14:41:07.119569 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkc2q_b434beff-f5d1-4b10-8715-10cdfe445919/extract-content/0.log" Dec 11 14:41:07 crc kubenswrapper[4898]: I1211 14:41:07.144174 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkc2q_b434beff-f5d1-4b10-8715-10cdfe445919/extract-utilities/0.log" Dec 11 14:41:07 crc kubenswrapper[4898]: I1211 14:41:07.286789 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkc2q_b434beff-f5d1-4b10-8715-10cdfe445919/extract-content/0.log" Dec 11 14:41:07 crc kubenswrapper[4898]: I1211 14:41:07.292858 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-nkc2q_b434beff-f5d1-4b10-8715-10cdfe445919/extract-utilities/0.log" Dec 11 14:41:07 crc kubenswrapper[4898]: I1211 14:41:07.396201 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jncnt_e837aefb-2b43-47e8-87b0-232560ff1b37/extract-utilities/0.log" Dec 11 14:41:07 crc kubenswrapper[4898]: I1211 14:41:07.558882 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jncnt_e837aefb-2b43-47e8-87b0-232560ff1b37/extract-utilities/0.log" Dec 11 14:41:07 crc kubenswrapper[4898]: I1211 14:41:07.563752 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jncnt_e837aefb-2b43-47e8-87b0-232560ff1b37/extract-content/0.log" Dec 11 14:41:07 crc kubenswrapper[4898]: I1211 14:41:07.576957 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jncnt_e837aefb-2b43-47e8-87b0-232560ff1b37/extract-content/0.log" Dec 11 14:41:07 crc kubenswrapper[4898]: I1211 14:41:07.845928 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jncnt_e837aefb-2b43-47e8-87b0-232560ff1b37/extract-utilities/0.log" Dec 11 14:41:07 crc kubenswrapper[4898]: I1211 14:41:07.867859 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jncnt_e837aefb-2b43-47e8-87b0-232560ff1b37/extract-content/0.log" Dec 11 14:41:08 crc kubenswrapper[4898]: I1211 14:41:08.091369 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-trbn2_bb6a1bca-9d01-4e70-882f-47a6e90923df/marketplace-operator/0.log" Dec 11 14:41:08 crc kubenswrapper[4898]: I1211 14:41:08.126094 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-jqfm2_f1d2e209-8101-4341-b232-ed52d1d9f629/extract-utilities/0.log" Dec 11 14:41:08 crc kubenswrapper[4898]: I1211 14:41:08.256739 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nkc2q_b434beff-f5d1-4b10-8715-10cdfe445919/registry-server/0.log" Dec 11 14:41:08 crc kubenswrapper[4898]: I1211 14:41:08.333696 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jqfm2_f1d2e209-8101-4341-b232-ed52d1d9f629/extract-content/0.log" Dec 11 14:41:08 crc kubenswrapper[4898]: I1211 14:41:08.333830 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jqfm2_f1d2e209-8101-4341-b232-ed52d1d9f629/extract-content/0.log" Dec 11 14:41:08 crc kubenswrapper[4898]: I1211 14:41:08.350598 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jqfm2_f1d2e209-8101-4341-b232-ed52d1d9f629/extract-utilities/0.log" Dec 11 14:41:08 crc kubenswrapper[4898]: I1211 14:41:08.594080 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jqfm2_f1d2e209-8101-4341-b232-ed52d1d9f629/extract-content/0.log" Dec 11 14:41:08 crc kubenswrapper[4898]: I1211 14:41:08.633628 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jqfm2_f1d2e209-8101-4341-b232-ed52d1d9f629/extract-utilities/0.log" Dec 11 14:41:08 crc kubenswrapper[4898]: I1211 14:41:08.836224 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kxmhp_c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a/extract-utilities/0.log" Dec 11 14:41:08 crc kubenswrapper[4898]: I1211 14:41:08.871549 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-jqfm2_f1d2e209-8101-4341-b232-ed52d1d9f629/registry-server/0.log" Dec 11 14:41:09 crc kubenswrapper[4898]: I1211 14:41:09.009428 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kxmhp_c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a/extract-utilities/0.log" Dec 11 14:41:09 crc kubenswrapper[4898]: I1211 14:41:09.019483 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jncnt_e837aefb-2b43-47e8-87b0-232560ff1b37/registry-server/0.log" Dec 11 14:41:09 crc kubenswrapper[4898]: I1211 14:41:09.043847 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kxmhp_c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a/extract-content/0.log" Dec 11 14:41:09 crc kubenswrapper[4898]: I1211 14:41:09.064823 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kxmhp_c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a/extract-content/0.log" Dec 11 14:41:09 crc kubenswrapper[4898]: I1211 14:41:09.262494 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kxmhp_c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a/extract-content/0.log" Dec 11 14:41:09 crc kubenswrapper[4898]: I1211 14:41:09.271029 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kxmhp_c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a/extract-utilities/0.log" Dec 11 14:41:10 crc kubenswrapper[4898]: I1211 14:41:10.003999 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kxmhp_c3a84ac6-e73e-4e1f-bd6e-cfb6c607dd4a/registry-server/0.log" Dec 11 14:41:22 crc kubenswrapper[4898]: I1211 14:41:22.448730 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-kflrx_e1934c2f-8b0a-4a1d-9da5-5de2822c6b82/prometheus-operator/0.log" Dec 11 14:41:22 crc kubenswrapper[4898]: I1211 14:41:22.583392 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67ddb68bbf-v6w7g_8c1b0eb8-fe9f-4ad8-897e-52e76251f1ef/prometheus-operator-admission-webhook/0.log" Dec 11 14:41:22 crc kubenswrapper[4898]: I1211 14:41:22.666882 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67ddb68bbf-wvkwh_a328cfd7-383d-4f47-9723-ef24187542bd/prometheus-operator-admission-webhook/0.log" Dec 11 14:41:22 crc kubenswrapper[4898]: I1211 14:41:22.778956 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-cnkcn_2fed6ea1-0c4d-476c-9f45-ca4b5a9dc406/operator/0.log" Dec 11 14:41:22 crc kubenswrapper[4898]: I1211 14:41:22.892013 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-wb65t_bb657218-de3e-4a7b-8412-cab942943d0a/observability-ui-dashboards/0.log" Dec 11 14:41:22 crc kubenswrapper[4898]: I1211 14:41:22.966379 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-nq968_c5f15058-ca6b-40a5-bad2-83ea7339d28b/perses-operator/0.log" Dec 11 14:41:37 crc kubenswrapper[4898]: I1211 14:41:37.696350 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7d9d9f99f6-7sstc_b4cf67d3-b13e-4afb-be20-80dc0801c69c/manager/0.log" Dec 11 14:41:37 crc kubenswrapper[4898]: I1211 14:41:37.721995 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7d9d9f99f6-7sstc_b4cf67d3-b13e-4afb-be20-80dc0801c69c/kube-rbac-proxy/0.log" Dec 11 14:41:52 
crc kubenswrapper[4898]: I1211 14:41:52.752499 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v48dc"] Dec 11 14:41:52 crc kubenswrapper[4898]: E1211 14:41:52.753468 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e19a83-12d3-4611-835b-c90337ab2663" containerName="extract-utilities" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.753481 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="62e19a83-12d3-4611-835b-c90337ab2663" containerName="extract-utilities" Dec 11 14:41:52 crc kubenswrapper[4898]: E1211 14:41:52.753493 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ef5c35-e68f-40ef-aa81-27670ff4dd27" containerName="extract-utilities" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.753501 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ef5c35-e68f-40ef-aa81-27670ff4dd27" containerName="extract-utilities" Dec 11 14:41:52 crc kubenswrapper[4898]: E1211 14:41:52.753530 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e19a83-12d3-4611-835b-c90337ab2663" containerName="registry-server" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.753536 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="62e19a83-12d3-4611-835b-c90337ab2663" containerName="registry-server" Dec 11 14:41:52 crc kubenswrapper[4898]: E1211 14:41:52.753567 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ef5c35-e68f-40ef-aa81-27670ff4dd27" containerName="registry-server" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.753574 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ef5c35-e68f-40ef-aa81-27670ff4dd27" containerName="registry-server" Dec 11 14:41:52 crc kubenswrapper[4898]: E1211 14:41:52.753598 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e19a83-12d3-4611-835b-c90337ab2663" containerName="extract-content" Dec 11 14:41:52 crc kubenswrapper[4898]: 
I1211 14:41:52.753603 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="62e19a83-12d3-4611-835b-c90337ab2663" containerName="extract-content" Dec 11 14:41:52 crc kubenswrapper[4898]: E1211 14:41:52.753652 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ef5c35-e68f-40ef-aa81-27670ff4dd27" containerName="extract-content" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.753657 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ef5c35-e68f-40ef-aa81-27670ff4dd27" containerName="extract-content" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.753854 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="62e19a83-12d3-4611-835b-c90337ab2663" containerName="registry-server" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.753873 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ef5c35-e68f-40ef-aa81-27670ff4dd27" containerName="registry-server" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.755496 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.802479 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v48dc"] Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.842178 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-catalog-content\") pod \"redhat-marketplace-v48dc\" (UID: \"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d\") " pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.842471 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-utilities\") pod \"redhat-marketplace-v48dc\" (UID: \"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d\") " pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.842525 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdblh\" (UniqueName: \"kubernetes.io/projected/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-kube-api-access-kdblh\") pod \"redhat-marketplace-v48dc\" (UID: \"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d\") " pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.944837 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-utilities\") pod \"redhat-marketplace-v48dc\" (UID: \"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d\") " pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.945136 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-kdblh\" (UniqueName: \"kubernetes.io/projected/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-kube-api-access-kdblh\") pod \"redhat-marketplace-v48dc\" (UID: \"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d\") " pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.945537 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-utilities\") pod \"redhat-marketplace-v48dc\" (UID: \"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d\") " pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.946651 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-catalog-content\") pod \"redhat-marketplace-v48dc\" (UID: \"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d\") " pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.947069 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-catalog-content\") pod \"redhat-marketplace-v48dc\" (UID: \"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d\") " pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:41:52 crc kubenswrapper[4898]: I1211 14:41:52.970805 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdblh\" (UniqueName: \"kubernetes.io/projected/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-kube-api-access-kdblh\") pod \"redhat-marketplace-v48dc\" (UID: \"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d\") " pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:41:53 crc kubenswrapper[4898]: I1211 14:41:53.074777 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:41:53 crc kubenswrapper[4898]: I1211 14:41:53.784701 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v48dc"] Dec 11 14:41:55 crc kubenswrapper[4898]: I1211 14:41:55.185710 4898 generic.go:334] "Generic (PLEG): container finished" podID="c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d" containerID="fbd285e0e9883e898bf21b8d70c51ad07aa06db97376f400aa9a4680aa0e5275" exitCode=0 Dec 11 14:41:55 crc kubenswrapper[4898]: I1211 14:41:55.220821 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 14:41:55 crc kubenswrapper[4898]: I1211 14:41:55.233943 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v48dc" event={"ID":"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d","Type":"ContainerDied","Data":"fbd285e0e9883e898bf21b8d70c51ad07aa06db97376f400aa9a4680aa0e5275"} Dec 11 14:41:55 crc kubenswrapper[4898]: I1211 14:41:55.234082 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v48dc" event={"ID":"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d","Type":"ContainerStarted","Data":"ecca41c71f6ad9601872542b73947a9ac7938f5e1767d32eabe6283b26154e3b"} Dec 11 14:41:57 crc kubenswrapper[4898]: I1211 14:41:57.210811 4898 generic.go:334] "Generic (PLEG): container finished" podID="c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d" containerID="ee0ee20ec902be0b3b10053ee911ac5bcb2b1f05a00c76b27944cd20afa1618c" exitCode=0 Dec 11 14:41:57 crc kubenswrapper[4898]: I1211 14:41:57.210901 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v48dc" event={"ID":"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d","Type":"ContainerDied","Data":"ee0ee20ec902be0b3b10053ee911ac5bcb2b1f05a00c76b27944cd20afa1618c"} Dec 11 14:42:01 crc kubenswrapper[4898]: I1211 14:42:01.263261 4898 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-v48dc" event={"ID":"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d","Type":"ContainerStarted","Data":"814fced28e460c3902d2a537ffb8415b33714fb813442502a0b098bd55b4f78f"} Dec 11 14:42:01 crc kubenswrapper[4898]: I1211 14:42:01.289049 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v48dc" podStartSLOduration=4.259457104 podStartE2EDuration="9.289027279s" podCreationTimestamp="2025-12-11 14:41:52 +0000 UTC" firstStartedPulling="2025-12-11 14:41:55.20172184 +0000 UTC m=+5872.774048277" lastFinishedPulling="2025-12-11 14:42:00.231292015 +0000 UTC m=+5877.803618452" observedRunningTime="2025-12-11 14:42:01.279585637 +0000 UTC m=+5878.851912074" watchObservedRunningTime="2025-12-11 14:42:01.289027279 +0000 UTC m=+5878.861353736" Dec 11 14:42:03 crc kubenswrapper[4898]: I1211 14:42:03.074994 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:42:03 crc kubenswrapper[4898]: I1211 14:42:03.075622 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:42:03 crc kubenswrapper[4898]: I1211 14:42:03.139090 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:42:13 crc kubenswrapper[4898]: I1211 14:42:13.127921 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:42:13 crc kubenswrapper[4898]: I1211 14:42:13.186729 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v48dc"] Dec 11 14:42:13 crc kubenswrapper[4898]: I1211 14:42:13.414931 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v48dc" 
podUID="c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d" containerName="registry-server" containerID="cri-o://814fced28e460c3902d2a537ffb8415b33714fb813442502a0b098bd55b4f78f" gracePeriod=2 Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.041498 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.131920 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-utilities\") pod \"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d\" (UID: \"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d\") " Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.132106 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-catalog-content\") pod \"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d\" (UID: \"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d\") " Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.132194 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdblh\" (UniqueName: \"kubernetes.io/projected/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-kube-api-access-kdblh\") pod \"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d\" (UID: \"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d\") " Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.134338 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-utilities" (OuterVolumeSpecName: "utilities") pod "c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d" (UID: "c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.143117 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-kube-api-access-kdblh" (OuterVolumeSpecName: "kube-api-access-kdblh") pod "c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d" (UID: "c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d"). InnerVolumeSpecName "kube-api-access-kdblh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.158154 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d" (UID: "c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.234769 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.234805 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.234821 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdblh\" (UniqueName: \"kubernetes.io/projected/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d-kube-api-access-kdblh\") on node \"crc\" DevicePath \"\"" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.427985 4898 generic.go:334] "Generic (PLEG): container finished" podID="c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d" 
containerID="814fced28e460c3902d2a537ffb8415b33714fb813442502a0b098bd55b4f78f" exitCode=0 Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.428044 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v48dc" event={"ID":"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d","Type":"ContainerDied","Data":"814fced28e460c3902d2a537ffb8415b33714fb813442502a0b098bd55b4f78f"} Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.428083 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v48dc" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.428093 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v48dc" event={"ID":"c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d","Type":"ContainerDied","Data":"ecca41c71f6ad9601872542b73947a9ac7938f5e1767d32eabe6283b26154e3b"} Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.428125 4898 scope.go:117] "RemoveContainer" containerID="814fced28e460c3902d2a537ffb8415b33714fb813442502a0b098bd55b4f78f" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.457991 4898 scope.go:117] "RemoveContainer" containerID="ee0ee20ec902be0b3b10053ee911ac5bcb2b1f05a00c76b27944cd20afa1618c" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.479530 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v48dc"] Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.486865 4898 scope.go:117] "RemoveContainer" containerID="fbd285e0e9883e898bf21b8d70c51ad07aa06db97376f400aa9a4680aa0e5275" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.491781 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v48dc"] Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.591614 4898 scope.go:117] "RemoveContainer" containerID="814fced28e460c3902d2a537ffb8415b33714fb813442502a0b098bd55b4f78f" Dec 11 
14:42:14 crc kubenswrapper[4898]: E1211 14:42:14.592139 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"814fced28e460c3902d2a537ffb8415b33714fb813442502a0b098bd55b4f78f\": container with ID starting with 814fced28e460c3902d2a537ffb8415b33714fb813442502a0b098bd55b4f78f not found: ID does not exist" containerID="814fced28e460c3902d2a537ffb8415b33714fb813442502a0b098bd55b4f78f" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.592197 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"814fced28e460c3902d2a537ffb8415b33714fb813442502a0b098bd55b4f78f"} err="failed to get container status \"814fced28e460c3902d2a537ffb8415b33714fb813442502a0b098bd55b4f78f\": rpc error: code = NotFound desc = could not find container \"814fced28e460c3902d2a537ffb8415b33714fb813442502a0b098bd55b4f78f\": container with ID starting with 814fced28e460c3902d2a537ffb8415b33714fb813442502a0b098bd55b4f78f not found: ID does not exist" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.592228 4898 scope.go:117] "RemoveContainer" containerID="ee0ee20ec902be0b3b10053ee911ac5bcb2b1f05a00c76b27944cd20afa1618c" Dec 11 14:42:14 crc kubenswrapper[4898]: E1211 14:42:14.592722 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee0ee20ec902be0b3b10053ee911ac5bcb2b1f05a00c76b27944cd20afa1618c\": container with ID starting with ee0ee20ec902be0b3b10053ee911ac5bcb2b1f05a00c76b27944cd20afa1618c not found: ID does not exist" containerID="ee0ee20ec902be0b3b10053ee911ac5bcb2b1f05a00c76b27944cd20afa1618c" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.592751 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee0ee20ec902be0b3b10053ee911ac5bcb2b1f05a00c76b27944cd20afa1618c"} err="failed to get container status 
\"ee0ee20ec902be0b3b10053ee911ac5bcb2b1f05a00c76b27944cd20afa1618c\": rpc error: code = NotFound desc = could not find container \"ee0ee20ec902be0b3b10053ee911ac5bcb2b1f05a00c76b27944cd20afa1618c\": container with ID starting with ee0ee20ec902be0b3b10053ee911ac5bcb2b1f05a00c76b27944cd20afa1618c not found: ID does not exist" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.592771 4898 scope.go:117] "RemoveContainer" containerID="fbd285e0e9883e898bf21b8d70c51ad07aa06db97376f400aa9a4680aa0e5275" Dec 11 14:42:14 crc kubenswrapper[4898]: E1211 14:42:14.593124 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbd285e0e9883e898bf21b8d70c51ad07aa06db97376f400aa9a4680aa0e5275\": container with ID starting with fbd285e0e9883e898bf21b8d70c51ad07aa06db97376f400aa9a4680aa0e5275 not found: ID does not exist" containerID="fbd285e0e9883e898bf21b8d70c51ad07aa06db97376f400aa9a4680aa0e5275" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.593149 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbd285e0e9883e898bf21b8d70c51ad07aa06db97376f400aa9a4680aa0e5275"} err="failed to get container status \"fbd285e0e9883e898bf21b8d70c51ad07aa06db97376f400aa9a4680aa0e5275\": rpc error: code = NotFound desc = could not find container \"fbd285e0e9883e898bf21b8d70c51ad07aa06db97376f400aa9a4680aa0e5275\": container with ID starting with fbd285e0e9883e898bf21b8d70c51ad07aa06db97376f400aa9a4680aa0e5275 not found: ID does not exist" Dec 11 14:42:14 crc kubenswrapper[4898]: I1211 14:42:14.789266 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d" path="/var/lib/kubelet/pods/c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d/volumes" Dec 11 14:43:04 crc kubenswrapper[4898]: I1211 14:43:04.995511 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:43:04 crc kubenswrapper[4898]: I1211 14:43:04.995898 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:43:11 crc kubenswrapper[4898]: I1211 14:43:11.672246 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-69ffd5987-wz9b5 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:43:11 crc kubenswrapper[4898]: I1211 14:43:11.672852 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-69ffd5987-wz9b5" podUID="6d6f1657-b9dd-4fba-a216-ca660a4fa958" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 14:43:32 crc kubenswrapper[4898]: I1211 14:43:32.082649 4898 generic.go:334] "Generic (PLEG): container finished" podID="7e6cf072-5ba2-483d-9105-7772c4a02929" containerID="05180daba4b3c82716c0c9ef849e4d95be4ec54c4f3edd315cbe96d6b595e646" exitCode=0 Dec 11 14:43:32 crc kubenswrapper[4898]: I1211 14:43:32.082766 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h99mt/must-gather-rk4kt" event={"ID":"7e6cf072-5ba2-483d-9105-7772c4a02929","Type":"ContainerDied","Data":"05180daba4b3c82716c0c9ef849e4d95be4ec54c4f3edd315cbe96d6b595e646"} Dec 11 14:43:32 crc kubenswrapper[4898]: I1211 14:43:32.084258 4898 scope.go:117] 
"RemoveContainer" containerID="05180daba4b3c82716c0c9ef849e4d95be4ec54c4f3edd315cbe96d6b595e646" Dec 11 14:43:32 crc kubenswrapper[4898]: I1211 14:43:32.815865 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-h99mt_must-gather-rk4kt_7e6cf072-5ba2-483d-9105-7772c4a02929/gather/0.log" Dec 11 14:43:34 crc kubenswrapper[4898]: I1211 14:43:34.995430 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:43:34 crc kubenswrapper[4898]: I1211 14:43:34.997121 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:43:40 crc kubenswrapper[4898]: I1211 14:43:40.754877 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-h99mt/must-gather-rk4kt"] Dec 11 14:43:40 crc kubenswrapper[4898]: I1211 14:43:40.756284 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-h99mt/must-gather-rk4kt" podUID="7e6cf072-5ba2-483d-9105-7772c4a02929" containerName="copy" containerID="cri-o://22ccb44ac9ef936558611c69137cfd71c856b941bd7ee3e4a58f4133bebf3e83" gracePeriod=2 Dec 11 14:43:40 crc kubenswrapper[4898]: I1211 14:43:40.772010 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-h99mt/must-gather-rk4kt"] Dec 11 14:43:41 crc kubenswrapper[4898]: I1211 14:43:41.186697 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-h99mt_must-gather-rk4kt_7e6cf072-5ba2-483d-9105-7772c4a02929/copy/0.log" Dec 11 
14:43:41 crc kubenswrapper[4898]: I1211 14:43:41.187650 4898 generic.go:334] "Generic (PLEG): container finished" podID="7e6cf072-5ba2-483d-9105-7772c4a02929" containerID="22ccb44ac9ef936558611c69137cfd71c856b941bd7ee3e4a58f4133bebf3e83" exitCode=143 Dec 11 14:43:41 crc kubenswrapper[4898]: I1211 14:43:41.316148 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-h99mt_must-gather-rk4kt_7e6cf072-5ba2-483d-9105-7772c4a02929/copy/0.log" Dec 11 14:43:41 crc kubenswrapper[4898]: I1211 14:43:41.316924 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h99mt/must-gather-rk4kt" Dec 11 14:43:41 crc kubenswrapper[4898]: I1211 14:43:41.356880 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsm8l\" (UniqueName: \"kubernetes.io/projected/7e6cf072-5ba2-483d-9105-7772c4a02929-kube-api-access-rsm8l\") pod \"7e6cf072-5ba2-483d-9105-7772c4a02929\" (UID: \"7e6cf072-5ba2-483d-9105-7772c4a02929\") " Dec 11 14:43:41 crc kubenswrapper[4898]: I1211 14:43:41.357030 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e6cf072-5ba2-483d-9105-7772c4a02929-must-gather-output\") pod \"7e6cf072-5ba2-483d-9105-7772c4a02929\" (UID: \"7e6cf072-5ba2-483d-9105-7772c4a02929\") " Dec 11 14:43:41 crc kubenswrapper[4898]: I1211 14:43:41.363605 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e6cf072-5ba2-483d-9105-7772c4a02929-kube-api-access-rsm8l" (OuterVolumeSpecName: "kube-api-access-rsm8l") pod "7e6cf072-5ba2-483d-9105-7772c4a02929" (UID: "7e6cf072-5ba2-483d-9105-7772c4a02929"). InnerVolumeSpecName "kube-api-access-rsm8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:43:41 crc kubenswrapper[4898]: I1211 14:43:41.460533 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsm8l\" (UniqueName: \"kubernetes.io/projected/7e6cf072-5ba2-483d-9105-7772c4a02929-kube-api-access-rsm8l\") on node \"crc\" DevicePath \"\"" Dec 11 14:43:41 crc kubenswrapper[4898]: I1211 14:43:41.587750 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e6cf072-5ba2-483d-9105-7772c4a02929-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7e6cf072-5ba2-483d-9105-7772c4a02929" (UID: "7e6cf072-5ba2-483d-9105-7772c4a02929"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 14:43:41 crc kubenswrapper[4898]: I1211 14:43:41.668297 4898 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e6cf072-5ba2-483d-9105-7772c4a02929-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 11 14:43:42 crc kubenswrapper[4898]: I1211 14:43:42.202489 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-h99mt_must-gather-rk4kt_7e6cf072-5ba2-483d-9105-7772c4a02929/copy/0.log" Dec 11 14:43:42 crc kubenswrapper[4898]: I1211 14:43:42.204237 4898 scope.go:117] "RemoveContainer" containerID="22ccb44ac9ef936558611c69137cfd71c856b941bd7ee3e4a58f4133bebf3e83" Dec 11 14:43:42 crc kubenswrapper[4898]: I1211 14:43:42.204420 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h99mt/must-gather-rk4kt" Dec 11 14:43:42 crc kubenswrapper[4898]: I1211 14:43:42.247297 4898 scope.go:117] "RemoveContainer" containerID="05180daba4b3c82716c0c9ef849e4d95be4ec54c4f3edd315cbe96d6b595e646" Dec 11 14:43:42 crc kubenswrapper[4898]: I1211 14:43:42.797389 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e6cf072-5ba2-483d-9105-7772c4a02929" path="/var/lib/kubelet/pods/7e6cf072-5ba2-483d-9105-7772c4a02929/volumes" Dec 11 14:44:04 crc kubenswrapper[4898]: I1211 14:44:04.996478 4898 patch_prober.go:28] interesting pod/machine-config-daemon-7mmvk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 14:44:04 crc kubenswrapper[4898]: I1211 14:44:04.997236 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 14:44:04 crc kubenswrapper[4898]: I1211 14:44:04.997293 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" Dec 11 14:44:04 crc kubenswrapper[4898]: I1211 14:44:04.998337 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae"} pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 14:44:04 crc kubenswrapper[4898]: I1211 14:44:04.998406 4898 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerName="machine-config-daemon" containerID="cri-o://4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" gracePeriod=600 Dec 11 14:44:05 crc kubenswrapper[4898]: E1211 14:44:05.121263 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:44:05 crc kubenswrapper[4898]: I1211 14:44:05.560799 4898 generic.go:334] "Generic (PLEG): container finished" podID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" exitCode=0 Dec 11 14:44:05 crc kubenswrapper[4898]: I1211 14:44:05.560852 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerDied","Data":"4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae"} Dec 11 14:44:05 crc kubenswrapper[4898]: I1211 14:44:05.560892 4898 scope.go:117] "RemoveContainer" containerID="8c3848400d3383130ae2106d38db8971207b13a878503083f06812e151d95d89" Dec 11 14:44:05 crc kubenswrapper[4898]: I1211 14:44:05.563688 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:44:05 crc kubenswrapper[4898]: E1211 14:44:05.564322 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:44:17 crc kubenswrapper[4898]: I1211 14:44:17.384505 4898 patch_prober.go:28] interesting pod/monitoring-plugin-5ffffb4f84-84dnp container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.77:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 14:44:17 crc kubenswrapper[4898]: I1211 14:44:17.385226 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-5ffffb4f84-84dnp" podUID="1111559b-96c1-4918-b502-1b5045b8a9da" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.77:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 14:44:17 crc kubenswrapper[4898]: I1211 14:44:17.775004 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:44:17 crc kubenswrapper[4898]: E1211 14:44:17.775388 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:44:31 crc kubenswrapper[4898]: I1211 14:44:31.775249 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:44:31 crc kubenswrapper[4898]: E1211 
14:44:31.776322 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:44:42 crc kubenswrapper[4898]: I1211 14:44:42.786450 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:44:42 crc kubenswrapper[4898]: E1211 14:44:42.787233 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:44:56 crc kubenswrapper[4898]: I1211 14:44:56.775152 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:44:56 crc kubenswrapper[4898]: E1211 14:44:56.776017 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.250839 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx"] Dec 11 14:45:00 
crc kubenswrapper[4898]: E1211 14:45:00.252545 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d" containerName="extract-utilities" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.252595 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d" containerName="extract-utilities" Dec 11 14:45:00 crc kubenswrapper[4898]: E1211 14:45:00.252626 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6cf072-5ba2-483d-9105-7772c4a02929" containerName="gather" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.252653 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6cf072-5ba2-483d-9105-7772c4a02929" containerName="gather" Dec 11 14:45:00 crc kubenswrapper[4898]: E1211 14:45:00.252672 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6cf072-5ba2-483d-9105-7772c4a02929" containerName="copy" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.252678 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6cf072-5ba2-483d-9105-7772c4a02929" containerName="copy" Dec 11 14:45:00 crc kubenswrapper[4898]: E1211 14:45:00.252704 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d" containerName="registry-server" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.252711 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d" containerName="registry-server" Dec 11 14:45:00 crc kubenswrapper[4898]: E1211 14:45:00.252755 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d" containerName="extract-content" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.252762 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d" containerName="extract-content" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.253511 
4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e6cf072-5ba2-483d-9105-7772c4a02929" containerName="copy" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.253556 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c81ab26f-a2a0-4306-a8ec-c7d7fe6cd13d" containerName="registry-server" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.253574 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e6cf072-5ba2-483d-9105-7772c4a02929" containerName="gather" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.255538 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.290872 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.295890 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx"] Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.300866 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.396268 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nmz8\" (UniqueName: \"kubernetes.io/projected/3068fe2d-4edf-4792-81ef-936509b8664d-kube-api-access-9nmz8\") pod \"collect-profiles-29424405-kxkmx\" (UID: \"3068fe2d-4edf-4792-81ef-936509b8664d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.396428 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3068fe2d-4edf-4792-81ef-936509b8664d-config-volume\") pod \"collect-profiles-29424405-kxkmx\" (UID: \"3068fe2d-4edf-4792-81ef-936509b8664d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.396684 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3068fe2d-4edf-4792-81ef-936509b8664d-secret-volume\") pod \"collect-profiles-29424405-kxkmx\" (UID: \"3068fe2d-4edf-4792-81ef-936509b8664d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.499612 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3068fe2d-4edf-4792-81ef-936509b8664d-secret-volume\") pod \"collect-profiles-29424405-kxkmx\" (UID: \"3068fe2d-4edf-4792-81ef-936509b8664d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.499746 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nmz8\" (UniqueName: \"kubernetes.io/projected/3068fe2d-4edf-4792-81ef-936509b8664d-kube-api-access-9nmz8\") pod \"collect-profiles-29424405-kxkmx\" (UID: \"3068fe2d-4edf-4792-81ef-936509b8664d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.499802 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3068fe2d-4edf-4792-81ef-936509b8664d-config-volume\") pod \"collect-profiles-29424405-kxkmx\" (UID: \"3068fe2d-4edf-4792-81ef-936509b8664d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" Dec 11 14:45:00 crc kubenswrapper[4898]: 
I1211 14:45:00.500750 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3068fe2d-4edf-4792-81ef-936509b8664d-config-volume\") pod \"collect-profiles-29424405-kxkmx\" (UID: \"3068fe2d-4edf-4792-81ef-936509b8664d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.506290 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3068fe2d-4edf-4792-81ef-936509b8664d-secret-volume\") pod \"collect-profiles-29424405-kxkmx\" (UID: \"3068fe2d-4edf-4792-81ef-936509b8664d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.529228 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nmz8\" (UniqueName: \"kubernetes.io/projected/3068fe2d-4edf-4792-81ef-936509b8664d-kube-api-access-9nmz8\") pod \"collect-profiles-29424405-kxkmx\" (UID: \"3068fe2d-4edf-4792-81ef-936509b8664d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" Dec 11 14:45:00 crc kubenswrapper[4898]: I1211 14:45:00.612971 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" Dec 11 14:45:01 crc kubenswrapper[4898]: I1211 14:45:01.082109 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx"] Dec 11 14:45:01 crc kubenswrapper[4898]: W1211 14:45:01.088682 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3068fe2d_4edf_4792_81ef_936509b8664d.slice/crio-d4f89ba6fbc423f99f766733a65222241ca97e9e1c9ab21993ad9c9d8137db87 WatchSource:0}: Error finding container d4f89ba6fbc423f99f766733a65222241ca97e9e1c9ab21993ad9c9d8137db87: Status 404 returned error can't find the container with id d4f89ba6fbc423f99f766733a65222241ca97e9e1c9ab21993ad9c9d8137db87 Dec 11 14:45:01 crc kubenswrapper[4898]: I1211 14:45:01.305988 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" event={"ID":"3068fe2d-4edf-4792-81ef-936509b8664d","Type":"ContainerStarted","Data":"d4f89ba6fbc423f99f766733a65222241ca97e9e1c9ab21993ad9c9d8137db87"} Dec 11 14:45:02 crc kubenswrapper[4898]: I1211 14:45:02.319815 4898 generic.go:334] "Generic (PLEG): container finished" podID="3068fe2d-4edf-4792-81ef-936509b8664d" containerID="836959fb41c62c9e6737fc1ee647897707698d440285c4c3fc694f3b282278f3" exitCode=0 Dec 11 14:45:02 crc kubenswrapper[4898]: I1211 14:45:02.319916 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" event={"ID":"3068fe2d-4edf-4792-81ef-936509b8664d","Type":"ContainerDied","Data":"836959fb41c62c9e6737fc1ee647897707698d440285c4c3fc694f3b282278f3"} Dec 11 14:45:03 crc kubenswrapper[4898]: I1211 14:45:03.822652 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" Dec 11 14:45:03 crc kubenswrapper[4898]: I1211 14:45:03.975139 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3068fe2d-4edf-4792-81ef-936509b8664d-secret-volume\") pod \"3068fe2d-4edf-4792-81ef-936509b8664d\" (UID: \"3068fe2d-4edf-4792-81ef-936509b8664d\") " Dec 11 14:45:03 crc kubenswrapper[4898]: I1211 14:45:03.975283 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nmz8\" (UniqueName: \"kubernetes.io/projected/3068fe2d-4edf-4792-81ef-936509b8664d-kube-api-access-9nmz8\") pod \"3068fe2d-4edf-4792-81ef-936509b8664d\" (UID: \"3068fe2d-4edf-4792-81ef-936509b8664d\") " Dec 11 14:45:03 crc kubenswrapper[4898]: I1211 14:45:03.975348 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3068fe2d-4edf-4792-81ef-936509b8664d-config-volume\") pod \"3068fe2d-4edf-4792-81ef-936509b8664d\" (UID: \"3068fe2d-4edf-4792-81ef-936509b8664d\") " Dec 11 14:45:03 crc kubenswrapper[4898]: I1211 14:45:03.976025 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3068fe2d-4edf-4792-81ef-936509b8664d-config-volume" (OuterVolumeSpecName: "config-volume") pod "3068fe2d-4edf-4792-81ef-936509b8664d" (UID: "3068fe2d-4edf-4792-81ef-936509b8664d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 14:45:03 crc kubenswrapper[4898]: I1211 14:45:03.976383 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3068fe2d-4edf-4792-81ef-936509b8664d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 14:45:03 crc kubenswrapper[4898]: I1211 14:45:03.981690 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3068fe2d-4edf-4792-81ef-936509b8664d-kube-api-access-9nmz8" (OuterVolumeSpecName: "kube-api-access-9nmz8") pod "3068fe2d-4edf-4792-81ef-936509b8664d" (UID: "3068fe2d-4edf-4792-81ef-936509b8664d"). InnerVolumeSpecName "kube-api-access-9nmz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 14:45:03 crc kubenswrapper[4898]: I1211 14:45:03.981805 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3068fe2d-4edf-4792-81ef-936509b8664d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3068fe2d-4edf-4792-81ef-936509b8664d" (UID: "3068fe2d-4edf-4792-81ef-936509b8664d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 14:45:04 crc kubenswrapper[4898]: I1211 14:45:04.078614 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3068fe2d-4edf-4792-81ef-936509b8664d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 11 14:45:04 crc kubenswrapper[4898]: I1211 14:45:04.078649 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nmz8\" (UniqueName: \"kubernetes.io/projected/3068fe2d-4edf-4792-81ef-936509b8664d-kube-api-access-9nmz8\") on node \"crc\" DevicePath \"\"" Dec 11 14:45:04 crc kubenswrapper[4898]: I1211 14:45:04.340627 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" event={"ID":"3068fe2d-4edf-4792-81ef-936509b8664d","Type":"ContainerDied","Data":"d4f89ba6fbc423f99f766733a65222241ca97e9e1c9ab21993ad9c9d8137db87"} Dec 11 14:45:04 crc kubenswrapper[4898]: I1211 14:45:04.340669 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4f89ba6fbc423f99f766733a65222241ca97e9e1c9ab21993ad9c9d8137db87" Dec 11 14:45:04 crc kubenswrapper[4898]: I1211 14:45:04.341076 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29424405-kxkmx" Dec 11 14:45:04 crc kubenswrapper[4898]: I1211 14:45:04.901829 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj"] Dec 11 14:45:04 crc kubenswrapper[4898]: I1211 14:45:04.914847 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29424360-8ltlj"] Dec 11 14:45:06 crc kubenswrapper[4898]: I1211 14:45:06.788990 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="016e1dee-c122-4f9b-8720-19bf59ce0987" path="/var/lib/kubelet/pods/016e1dee-c122-4f9b-8720-19bf59ce0987/volumes" Dec 11 14:45:09 crc kubenswrapper[4898]: I1211 14:45:09.775397 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:45:09 crc kubenswrapper[4898]: E1211 14:45:09.776017 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:45:23 crc kubenswrapper[4898]: I1211 14:45:23.775220 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:45:23 crc kubenswrapper[4898]: E1211 14:45:23.775970 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:45:34 crc kubenswrapper[4898]: I1211 14:45:34.775753 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:45:34 crc kubenswrapper[4898]: E1211 14:45:34.776735 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:45:45 crc kubenswrapper[4898]: I1211 14:45:45.776056 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:45:45 crc kubenswrapper[4898]: E1211 14:45:45.777543 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:45:46 crc kubenswrapper[4898]: I1211 14:45:46.932400 4898 scope.go:117] "RemoveContainer" containerID="50152bfd32e95e33b7afd08f1a0665c7beb6fb918ba3c6e5d20e85c9572b44d4" Dec 11 14:45:57 crc kubenswrapper[4898]: I1211 14:45:57.777283 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:45:57 crc kubenswrapper[4898]: E1211 14:45:57.777910 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:46:11 crc kubenswrapper[4898]: I1211 14:46:11.776203 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:46:11 crc kubenswrapper[4898]: E1211 14:46:11.777723 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:46:25 crc kubenswrapper[4898]: I1211 14:46:25.775508 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:46:25 crc kubenswrapper[4898]: E1211 14:46:25.776661 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:46:37 crc kubenswrapper[4898]: I1211 14:46:37.775669 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:46:37 crc kubenswrapper[4898]: E1211 14:46:37.776911 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:46:52 crc kubenswrapper[4898]: I1211 14:46:52.815113 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:46:52 crc kubenswrapper[4898]: E1211 14:46:52.817647 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:47:05 crc kubenswrapper[4898]: I1211 14:47:05.774778 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:47:05 crc kubenswrapper[4898]: E1211 14:47:05.775616 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:47:17 crc kubenswrapper[4898]: I1211 14:47:17.775997 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:47:17 crc kubenswrapper[4898]: E1211 14:47:17.776732 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:47:29 crc kubenswrapper[4898]: I1211 14:47:29.775559 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:47:29 crc kubenswrapper[4898]: E1211 14:47:29.776363 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:47:42 crc kubenswrapper[4898]: I1211 14:47:42.787285 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:47:42 crc kubenswrapper[4898]: E1211 14:47:42.789164 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:47:55 crc kubenswrapper[4898]: I1211 14:47:55.775445 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:47:55 crc kubenswrapper[4898]: E1211 14:47:55.776383 4898 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:48:06 crc kubenswrapper[4898]: I1211 14:48:06.776290 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:48:06 crc kubenswrapper[4898]: E1211 14:48:06.777162 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:48:18 crc kubenswrapper[4898]: I1211 14:48:18.776866 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:48:18 crc kubenswrapper[4898]: E1211 14:48:18.777814 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:48:33 crc kubenswrapper[4898]: I1211 14:48:33.775905 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:48:33 crc kubenswrapper[4898]: E1211 14:48:33.776817 4898 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:48:44 crc kubenswrapper[4898]: I1211 14:48:44.778664 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:48:44 crc kubenswrapper[4898]: E1211 14:48:44.779575 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:48:50 crc kubenswrapper[4898]: I1211 14:48:50.953648 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j7qpf"] Dec 11 14:48:50 crc kubenswrapper[4898]: E1211 14:48:50.954865 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3068fe2d-4edf-4792-81ef-936509b8664d" containerName="collect-profiles" Dec 11 14:48:50 crc kubenswrapper[4898]: I1211 14:48:50.954881 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3068fe2d-4edf-4792-81ef-936509b8664d" containerName="collect-profiles" Dec 11 14:48:50 crc kubenswrapper[4898]: I1211 14:48:50.955237 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3068fe2d-4edf-4792-81ef-936509b8664d" containerName="collect-profiles" Dec 11 14:48:50 crc kubenswrapper[4898]: I1211 14:48:50.961939 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j7qpf" Dec 11 14:48:50 crc kubenswrapper[4898]: I1211 14:48:50.967482 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j7qpf"] Dec 11 14:48:51 crc kubenswrapper[4898]: I1211 14:48:51.016105 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b3809e4-ba63-4896-87ef-648c16729b53-catalog-content\") pod \"community-operators-j7qpf\" (UID: \"1b3809e4-ba63-4896-87ef-648c16729b53\") " pod="openshift-marketplace/community-operators-j7qpf" Dec 11 14:48:51 crc kubenswrapper[4898]: I1211 14:48:51.016206 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b3809e4-ba63-4896-87ef-648c16729b53-utilities\") pod \"community-operators-j7qpf\" (UID: \"1b3809e4-ba63-4896-87ef-648c16729b53\") " pod="openshift-marketplace/community-operators-j7qpf" Dec 11 14:48:51 crc kubenswrapper[4898]: I1211 14:48:51.016405 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z74vw\" (UniqueName: \"kubernetes.io/projected/1b3809e4-ba63-4896-87ef-648c16729b53-kube-api-access-z74vw\") pod \"community-operators-j7qpf\" (UID: \"1b3809e4-ba63-4896-87ef-648c16729b53\") " pod="openshift-marketplace/community-operators-j7qpf" Dec 11 14:48:51 crc kubenswrapper[4898]: I1211 14:48:51.118795 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b3809e4-ba63-4896-87ef-648c16729b53-utilities\") pod \"community-operators-j7qpf\" (UID: \"1b3809e4-ba63-4896-87ef-648c16729b53\") " pod="openshift-marketplace/community-operators-j7qpf" Dec 11 14:48:51 crc kubenswrapper[4898]: I1211 14:48:51.119237 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b3809e4-ba63-4896-87ef-648c16729b53-utilities\") pod \"community-operators-j7qpf\" (UID: \"1b3809e4-ba63-4896-87ef-648c16729b53\") " pod="openshift-marketplace/community-operators-j7qpf" Dec 11 14:48:51 crc kubenswrapper[4898]: I1211 14:48:51.119246 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z74vw\" (UniqueName: \"kubernetes.io/projected/1b3809e4-ba63-4896-87ef-648c16729b53-kube-api-access-z74vw\") pod \"community-operators-j7qpf\" (UID: \"1b3809e4-ba63-4896-87ef-648c16729b53\") " pod="openshift-marketplace/community-operators-j7qpf" Dec 11 14:48:51 crc kubenswrapper[4898]: I1211 14:48:51.119498 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b3809e4-ba63-4896-87ef-648c16729b53-catalog-content\") pod \"community-operators-j7qpf\" (UID: \"1b3809e4-ba63-4896-87ef-648c16729b53\") " pod="openshift-marketplace/community-operators-j7qpf" Dec 11 14:48:51 crc kubenswrapper[4898]: I1211 14:48:51.120035 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b3809e4-ba63-4896-87ef-648c16729b53-catalog-content\") pod \"community-operators-j7qpf\" (UID: \"1b3809e4-ba63-4896-87ef-648c16729b53\") " pod="openshift-marketplace/community-operators-j7qpf" Dec 11 14:48:51 crc kubenswrapper[4898]: I1211 14:48:51.145384 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z74vw\" (UniqueName: \"kubernetes.io/projected/1b3809e4-ba63-4896-87ef-648c16729b53-kube-api-access-z74vw\") pod \"community-operators-j7qpf\" (UID: \"1b3809e4-ba63-4896-87ef-648c16729b53\") " pod="openshift-marketplace/community-operators-j7qpf" Dec 11 14:48:51 crc kubenswrapper[4898]: I1211 14:48:51.290779 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j7qpf" Dec 11 14:48:51 crc kubenswrapper[4898]: I1211 14:48:51.679734 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j7qpf"] Dec 11 14:48:52 crc kubenswrapper[4898]: I1211 14:48:52.191967 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7qpf" event={"ID":"1b3809e4-ba63-4896-87ef-648c16729b53","Type":"ContainerStarted","Data":"1a9ab71fcd3d6f295bebf91c2e8d53dd943f314093188afb6fa415b256b3360e"} Dec 11 14:48:52 crc kubenswrapper[4898]: I1211 14:48:52.192315 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7qpf" event={"ID":"1b3809e4-ba63-4896-87ef-648c16729b53","Type":"ContainerStarted","Data":"ff60766c5bc378bb08dac9097dc88bc83fc2381797d365e075acfb2670505d43"} Dec 11 14:48:53 crc kubenswrapper[4898]: I1211 14:48:53.208445 4898 generic.go:334] "Generic (PLEG): container finished" podID="1b3809e4-ba63-4896-87ef-648c16729b53" containerID="1a9ab71fcd3d6f295bebf91c2e8d53dd943f314093188afb6fa415b256b3360e" exitCode=0 Dec 11 14:48:53 crc kubenswrapper[4898]: I1211 14:48:53.208557 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7qpf" event={"ID":"1b3809e4-ba63-4896-87ef-648c16729b53","Type":"ContainerDied","Data":"1a9ab71fcd3d6f295bebf91c2e8d53dd943f314093188afb6fa415b256b3360e"} Dec 11 14:48:53 crc kubenswrapper[4898]: I1211 14:48:53.211804 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 14:48:54 crc kubenswrapper[4898]: I1211 14:48:54.222051 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7qpf" event={"ID":"1b3809e4-ba63-4896-87ef-648c16729b53","Type":"ContainerStarted","Data":"b32c3c71435a1edca28c79eee0d9cde1e8b12b44b61d144eb3000c138d2e49a3"} Dec 11 14:48:56 crc 
kubenswrapper[4898]: I1211 14:48:56.251445 4898 generic.go:334] "Generic (PLEG): container finished" podID="1b3809e4-ba63-4896-87ef-648c16729b53" containerID="b32c3c71435a1edca28c79eee0d9cde1e8b12b44b61d144eb3000c138d2e49a3" exitCode=0 Dec 11 14:48:56 crc kubenswrapper[4898]: I1211 14:48:56.251504 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7qpf" event={"ID":"1b3809e4-ba63-4896-87ef-648c16729b53","Type":"ContainerDied","Data":"b32c3c71435a1edca28c79eee0d9cde1e8b12b44b61d144eb3000c138d2e49a3"} Dec 11 14:48:56 crc kubenswrapper[4898]: I1211 14:48:56.775496 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae" Dec 11 14:48:56 crc kubenswrapper[4898]: E1211 14:48:56.776007 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7mmvk_openshift-machine-config-operator(b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c)\"" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" podUID="b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c" Dec 11 14:48:57 crc kubenswrapper[4898]: I1211 14:48:57.267283 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7qpf" event={"ID":"1b3809e4-ba63-4896-87ef-648c16729b53","Type":"ContainerStarted","Data":"924a25084db43aa1bcd736e33acfd59e0fb92a58fecf9eceb7ab6c3d5fb50389"} Dec 11 14:48:57 crc kubenswrapper[4898]: I1211 14:48:57.309367 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j7qpf" podStartSLOduration=3.688723802 podStartE2EDuration="7.30933563s" podCreationTimestamp="2025-12-11 14:48:50 +0000 UTC" firstStartedPulling="2025-12-11 14:48:53.211429685 +0000 UTC m=+6290.783756122" lastFinishedPulling="2025-12-11 14:48:56.832041503 +0000 
UTC m=+6294.404367950" observedRunningTime="2025-12-11 14:48:57.299891756 +0000 UTC m=+6294.872218203" watchObservedRunningTime="2025-12-11 14:48:57.30933563 +0000 UTC m=+6294.881662087" Dec 11 14:49:01 crc kubenswrapper[4898]: I1211 14:49:01.291431 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j7qpf" Dec 11 14:49:01 crc kubenswrapper[4898]: I1211 14:49:01.292849 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j7qpf" Dec 11 14:49:01 crc kubenswrapper[4898]: I1211 14:49:01.392711 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j7qpf" Dec 11 14:49:02 crc kubenswrapper[4898]: I1211 14:49:02.401910 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j7qpf" Dec 11 14:49:02 crc kubenswrapper[4898]: I1211 14:49:02.467887 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j7qpf"] Dec 11 14:49:04 crc kubenswrapper[4898]: I1211 14:49:04.359299 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j7qpf" podUID="1b3809e4-ba63-4896-87ef-648c16729b53" containerName="registry-server" containerID="cri-o://924a25084db43aa1bcd736e33acfd59e0fb92a58fecf9eceb7ab6c3d5fb50389" gracePeriod=2 Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.034667 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j7qpf"
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.206892 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z74vw\" (UniqueName: \"kubernetes.io/projected/1b3809e4-ba63-4896-87ef-648c16729b53-kube-api-access-z74vw\") pod \"1b3809e4-ba63-4896-87ef-648c16729b53\" (UID: \"1b3809e4-ba63-4896-87ef-648c16729b53\") "
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.207288 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b3809e4-ba63-4896-87ef-648c16729b53-catalog-content\") pod \"1b3809e4-ba63-4896-87ef-648c16729b53\" (UID: \"1b3809e4-ba63-4896-87ef-648c16729b53\") "
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.207379 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b3809e4-ba63-4896-87ef-648c16729b53-utilities\") pod \"1b3809e4-ba63-4896-87ef-648c16729b53\" (UID: \"1b3809e4-ba63-4896-87ef-648c16729b53\") "
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.208900 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b3809e4-ba63-4896-87ef-648c16729b53-utilities" (OuterVolumeSpecName: "utilities") pod "1b3809e4-ba63-4896-87ef-648c16729b53" (UID: "1b3809e4-ba63-4896-87ef-648c16729b53"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.217654 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3809e4-ba63-4896-87ef-648c16729b53-kube-api-access-z74vw" (OuterVolumeSpecName: "kube-api-access-z74vw") pod "1b3809e4-ba63-4896-87ef-648c16729b53" (UID: "1b3809e4-ba63-4896-87ef-648c16729b53"). InnerVolumeSpecName "kube-api-access-z74vw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.259696 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b3809e4-ba63-4896-87ef-648c16729b53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b3809e4-ba63-4896-87ef-648c16729b53" (UID: "1b3809e4-ba63-4896-87ef-648c16729b53"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.312121 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z74vw\" (UniqueName: \"kubernetes.io/projected/1b3809e4-ba63-4896-87ef-648c16729b53-kube-api-access-z74vw\") on node \"crc\" DevicePath \"\""
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.312156 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b3809e4-ba63-4896-87ef-648c16729b53-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.312165 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b3809e4-ba63-4896-87ef-648c16729b53-utilities\") on node \"crc\" DevicePath \"\""
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.377043 4898 generic.go:334] "Generic (PLEG): container finished" podID="1b3809e4-ba63-4896-87ef-648c16729b53" containerID="924a25084db43aa1bcd736e33acfd59e0fb92a58fecf9eceb7ab6c3d5fb50389" exitCode=0
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.377088 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7qpf" event={"ID":"1b3809e4-ba63-4896-87ef-648c16729b53","Type":"ContainerDied","Data":"924a25084db43aa1bcd736e33acfd59e0fb92a58fecf9eceb7ab6c3d5fb50389"}
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.377130 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7qpf" event={"ID":"1b3809e4-ba63-4896-87ef-648c16729b53","Type":"ContainerDied","Data":"ff60766c5bc378bb08dac9097dc88bc83fc2381797d365e075acfb2670505d43"}
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.377148 4898 scope.go:117] "RemoveContainer" containerID="924a25084db43aa1bcd736e33acfd59e0fb92a58fecf9eceb7ab6c3d5fb50389"
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.377173 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j7qpf"
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.402603 4898 scope.go:117] "RemoveContainer" containerID="b32c3c71435a1edca28c79eee0d9cde1e8b12b44b61d144eb3000c138d2e49a3"
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.418130 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j7qpf"]
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.430041 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j7qpf"]
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.463669 4898 scope.go:117] "RemoveContainer" containerID="1a9ab71fcd3d6f295bebf91c2e8d53dd943f314093188afb6fa415b256b3360e"
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.501869 4898 scope.go:117] "RemoveContainer" containerID="924a25084db43aa1bcd736e33acfd59e0fb92a58fecf9eceb7ab6c3d5fb50389"
Dec 11 14:49:05 crc kubenswrapper[4898]: E1211 14:49:05.502380 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"924a25084db43aa1bcd736e33acfd59e0fb92a58fecf9eceb7ab6c3d5fb50389\": container with ID starting with 924a25084db43aa1bcd736e33acfd59e0fb92a58fecf9eceb7ab6c3d5fb50389 not found: ID does not exist" containerID="924a25084db43aa1bcd736e33acfd59e0fb92a58fecf9eceb7ab6c3d5fb50389"
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.502423 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"924a25084db43aa1bcd736e33acfd59e0fb92a58fecf9eceb7ab6c3d5fb50389"} err="failed to get container status \"924a25084db43aa1bcd736e33acfd59e0fb92a58fecf9eceb7ab6c3d5fb50389\": rpc error: code = NotFound desc = could not find container \"924a25084db43aa1bcd736e33acfd59e0fb92a58fecf9eceb7ab6c3d5fb50389\": container with ID starting with 924a25084db43aa1bcd736e33acfd59e0fb92a58fecf9eceb7ab6c3d5fb50389 not found: ID does not exist"
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.502451 4898 scope.go:117] "RemoveContainer" containerID="b32c3c71435a1edca28c79eee0d9cde1e8b12b44b61d144eb3000c138d2e49a3"
Dec 11 14:49:05 crc kubenswrapper[4898]: E1211 14:49:05.503429 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b32c3c71435a1edca28c79eee0d9cde1e8b12b44b61d144eb3000c138d2e49a3\": container with ID starting with b32c3c71435a1edca28c79eee0d9cde1e8b12b44b61d144eb3000c138d2e49a3 not found: ID does not exist" containerID="b32c3c71435a1edca28c79eee0d9cde1e8b12b44b61d144eb3000c138d2e49a3"
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.503467 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b32c3c71435a1edca28c79eee0d9cde1e8b12b44b61d144eb3000c138d2e49a3"} err="failed to get container status \"b32c3c71435a1edca28c79eee0d9cde1e8b12b44b61d144eb3000c138d2e49a3\": rpc error: code = NotFound desc = could not find container \"b32c3c71435a1edca28c79eee0d9cde1e8b12b44b61d144eb3000c138d2e49a3\": container with ID starting with b32c3c71435a1edca28c79eee0d9cde1e8b12b44b61d144eb3000c138d2e49a3 not found: ID does not exist"
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.503482 4898 scope.go:117] "RemoveContainer" containerID="1a9ab71fcd3d6f295bebf91c2e8d53dd943f314093188afb6fa415b256b3360e"
Dec 11 14:49:05 crc kubenswrapper[4898]: E1211 14:49:05.503722 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a9ab71fcd3d6f295bebf91c2e8d53dd943f314093188afb6fa415b256b3360e\": container with ID starting with 1a9ab71fcd3d6f295bebf91c2e8d53dd943f314093188afb6fa415b256b3360e not found: ID does not exist" containerID="1a9ab71fcd3d6f295bebf91c2e8d53dd943f314093188afb6fa415b256b3360e"
Dec 11 14:49:05 crc kubenswrapper[4898]: I1211 14:49:05.503757 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a9ab71fcd3d6f295bebf91c2e8d53dd943f314093188afb6fa415b256b3360e"} err="failed to get container status \"1a9ab71fcd3d6f295bebf91c2e8d53dd943f314093188afb6fa415b256b3360e\": rpc error: code = NotFound desc = could not find container \"1a9ab71fcd3d6f295bebf91c2e8d53dd943f314093188afb6fa415b256b3360e\": container with ID starting with 1a9ab71fcd3d6f295bebf91c2e8d53dd943f314093188afb6fa415b256b3360e not found: ID does not exist"
Dec 11 14:49:06 crc kubenswrapper[4898]: I1211 14:49:06.793194 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b3809e4-ba63-4896-87ef-648c16729b53" path="/var/lib/kubelet/pods/1b3809e4-ba63-4896-87ef-648c16729b53/volumes"
Dec 11 14:49:07 crc kubenswrapper[4898]: I1211 14:49:07.778017 4898 scope.go:117] "RemoveContainer" containerID="4a948fb24285fadc33e95803387e30a8892f96c49c37e3a977d4366e7a6db1ae"
Dec 11 14:49:08 crc kubenswrapper[4898]: I1211 14:49:08.423893 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7mmvk" event={"ID":"b7d3e80d-5f0f-4e25-a2b5-fa5a6f5d742c","Type":"ContainerStarted","Data":"690ae9c3a52b5dee9a040cec3e33b258f45dfc4bfb0d42b188896e64d6e6e2e9"}